Thursday, January 31, 2019

#49. The Reentrant-pathway Theory of Mental Illness [neuroscience]


Red, theory; black, fact.

This theory is a further development of post #45, “The Denervation-supersensitivity Theory of Mental Illness.”

The basic idea here is that if a region of cerebral cortex is overgrown relative to a major synaptic partner, not only will it be starved of synaptic input from the partner, but it will also produce excess axons going to that partner that will have difficulty finding enough dendritic territories on which to synapse. Both difficulties can be solved at one stroke, however, if the overgrown area synapses on itself. The logic is similar to the application of valence rules in chemistry.

This mode of repair will produce cyclic signaling pathways (called “reentrant” in Neurospeak) that could support endlessly circulating neural activity. This is therefore an alternative way of getting what I have called “autonomous activity” from dysregulated cortical growth, with no need to invoke the phenomenon of denervation supersensitivity. The loop circumnavigation time would have to be long enough to allow recovery from any refractory period that follows nerve-impulse firing.
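To make that timing constraint concrete, here is a minimal sketch with purely illustrative, assumed numbers (the per-synapse and conduction delays and the refractory period are placeholders, not measured values); it simply checks whether one lap around a hypothetical loop takes longer than the refractory period:

def loop_time_ms(n_synapses, synaptic_delay_ms=1.0, conduction_delay_ms=2.0):
    """Total one-lap time: each link contributes one synaptic delay plus one conduction delay."""
    return n_synapses * (synaptic_delay_ms + conduction_delay_ms)

refractory_ms = 5.0   # assumed combined absolute + relative refractory period

for n in (1, 2, 5, 20):
    t = loop_time_ms(n)
    verdict = "can" if t > refractory_ms else "cannot"
    print(f"{n:2d}-synapse loop: lap time {t:5.1f} ms -> {verdict} sustain circulating activity")

On these assumed numbers, a one-synapse loop is too fast to outlast the refractory period, whereas longer, polysynaptic loops are not; the qualitative point, not the particular figures, is what matters here.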

The autonomous activity will give rise to hallucinations (called psychotic symptoms) if the reentrant pathway is in sensory cortex, and to manic behavior if it is in cortex with decidedly motoric functions, which would include planning. Since I have conjectured in these pages that an emotion is a high-level motor command, a reentrant pathway in frontal limbic cortex would produce an apparent emotion disconnected from conscious experience, and one in posterior limbic cortex, a hallucinated emotion trigger.

The situation is very similar if a cortical area is normal in size but one of its main synaptic partners is reduced in size by disease. In epileptogenesis, the post-damage remodelling of the local neural networks is known to be associated with new-synapse formation and the sprouting of axon collaterals. The hyperexcitable brain tissue responsible for triggering seizures is known to lie just outside the dead core zone of the damaged region, and can therefore be called “overgrown” relative to the dead zone, which has zero functioning neurons.

All this is compatible with the formation, during the latent period of epileptogenesis, of a pair of counter-circulating, polysynaptic “ring roads” around the perimeter of the damaged area. This process would be determined by simple rules of valency satisfaction. Both ring roads would be capable of carrying autonomous activity that progresses to a seizure, although this might only happen if inhibitory tone is also compromised. Hallucinations and seizures seem to be different grades of the same phenomenon. Indeed, auditory hallucinations commonly occur in association with temporal-lobe seizures, and the temporal lobe is the location of the auditory cortex (Brodmann areas 41 and 42).

Neural net repair leading to seizures.


Monday, December 31, 2018

#48. Science and Proto-science [evolutionary psychology]



Red, theory; black, fact.

Why does religion continue to be so popular in today's supposedly enlightened age? In what category of things should we place religion for purposes of analysis? This is a very important question. The least bad answer that I have come up with is: "Religion is the last protoscience." (By this I mean "classical protoscience"; a contemporary field of study, string theory, has also been labelled a "protoscience," a conclusion I base on a DuckDuckGo search for "Is string theory a protoscience?" on 20 Feb 2022.)

Protoscience is most easily defined by a few well-known examples: alchemy and astrology. These disciplines can be thought of as crude, primordial versions of chemistry and astronomy, respectively, that were unable to quickly divest themselves of laughably bad theories, owing to an over-reliance on aesthetics as a way to truth.

If religion is a protoscience, what, then, is the corresponding science? Will religion someday transform into some kind of super-science, marvelous beyond all prior imagining, and capable of robustly duplicating all the miracles of Christ, just for starters?

08-03-2020: Formerly at this location, now deprecated: Religion is the protoscience of origins, and Darwin's theory is its successor via the clergyman Malthus. Malthus was one of Darwin's influences, as Darwin himself explicitly attested in his writings.

07-26-2020: The science that could replace the protoscience religion is likely to be the study of adaptive, distributed, and unconscious behavioral effects in human populations. <07-30-2020: This will be a division within sociobiology focused on human swarm intelligence acting on an historical time scale.> From my own examined experience, I have reason to believe that such things exist. I called them "macro-homeostatic effects" in the post "The Drill Sergeants of the Apocalypse."

Alchemy is thought to have become chemistry with the isolation of oxygen in pure form by Priestley, followed in short order by its recognition as an element by Lavoisier, who had met Priestley in Paris and learned of the new "air" directly from its discoverer. This clue led Lavoisier to a correct theory of the nature of combustion. Priestley published his discovery of oxygen (Lavoisier's term), which he called "dephlogisticated air" (an alchemical term), in letter form, in 1775.

06-28-2019: The corresponding intellectual hand-off from astrology to astronomy seems to have been from Tycho Brahe (1546-1601), who was much involved with astrology, to his onetime assistant Johannes Kepler (1571-1630; "The Legislator of the Heavens"), who derived his three famous mathematical laws of planetary motion from Brahe's data.

While the former astrology continues to this day as basically a form of amusement and favorite whipping-boy of sophomores everywhere who are just discovering the use of their brains, and the former alchemy has utterly perished (as theory, not practice), religion continues to pay its way down the time stream as a purveyor of a useful approximate theory.

An approximate theory is useful to have if all you need is a quick and dirty answer. The theory that the Earth is flat is an approximate theory that we use every time we take a step. The corresponding exact theory, that the Earth is spherical and gravitating, is only needed for challenging projects such as travelling to the moon.
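As a back-of-the-envelope illustration of how good the flat-Earth approximation is at everyday scales (a sketch with arbitrary example path lengths; Earth's mean radius is the only real datum), the following compares the length of an arc walked along the curved surface with the straight chord between its endpoints, which is what a flat-geometry treatment would use:

import math

R = 6.371e6  # Earth's mean radius, meters

def flat_theory_error(d):
    """How far the straight-chord (flat-geometry) distance falls short of the true arc length d."""
    theta = d / R                        # angle subtended at Earth's center by the arc
    chord = 2 * R * math.sin(theta / 2)  # straight-line distance between the endpoints
    return d - chord

for d in (1.0, 1.0e3, 5.0e6):            # roughly a step, a one-km walk, a transcontinental arc
    print(f"arc {d:9.0f} m   flat-theory error {flat_theory_error(d):.3e} m")

For a single step the discrepancy is immeasurably small, for a kilometre it is about a micron, and only at continental or planetary scales does the curvature become something an engineer must budget for, which is the sense in which the exact theory is needed for projects like travelling to the moon.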

03-13-2020: Thus, the God hypothesis is the theory of natural selection seen "through a glass darkly." However, the experiences contributing to the formulation of the God hypothesis would have been due to whatever causes produce seemingly miraculous events over the horizon or beyond the reach of individual memory. These would comprise a mixture of the fastest effects of evolution and the slowest effects of synaptic plasticity/learning (e.g., developmental sensitive periods). However, the capacity for learning is itself due to natural selection, and learning is, like natural selection, a trial-and-error process. Thus, the two sources of biological order hinting at the existence of God should usually be pulling in the same direction, but perhaps with different levels of detail. Modern skepticism about religion seems to be directed at the intellectual anchor point: the God hypothesis. Since I believe that they are best de-faithed who are least de-faithed, let us simply shift the anchor to natural selection and carry on.

I think it premature to abandon classical religion as a source of moral guidance before evolutionary psychology is better developed, and given the usefulness of approximate theories, complete abandonment may never be practical. However, in our day humanity is beset with many pressing problems, and although atheism appears to be in the ascendant, it may be time to reconcile religion with science, so as not to throw out any babies with the bathwater.

The modes of worship in use in many modern religions may well confer psychological benefits on the pious not yet suspected or articulated by mainstream science. Scientific investigation of the modes of worship that many religions have in common seems in order, especially since they amount to folk wisdom, which is sometimes on the money. Examples of common practices that seem to have potential for inducing neurophysiological changes are prayer, fasting, pilgrimage, incense-burning, and even simple congregating.

Photo by JJ Jordan on Unsplash

Sunday, December 30, 2018

#47. Body-mod Bob's [evolution, evolutionary psychology]




Red, theory; black, fact.

In the previous post, "Goddesses of the Glacier," evolution appears to be operating in cooperation with a general capacity for technology: natural selection operates on the brain pathways underlying our aesthetic preferences concerning our own appearance and that of possible reproduction partners, and then a technology is automatically developed to satisfy those preferences.

As a first example, consider the oil-and-brush technology previously assumed for differentiating women from men by hair smoothness. A further step in this direction is to posit that hair color may have been used to code gender. The first step would have been selection for blond(e) hair in both women and men. Since this is a very light color, it will show the effect of dyeing maximally. Concurrently with this, the aesthetic preferences of men and women would have been differentiated by selection, resulting in blonde women who experience a mild euphoria from being blonde and blond men who experience a mild dysphoria from the same cause. The men would predictably get busy inventing hair-dyeing technologies to rectify this. The necessary dyes are readily obtained from plant sources such as woad and walnut shells. The result would be an effective blonde-female/nonblond-male gender code.

This style of evolution could be very fast if the brain pathways of aesthetic preferences require few mutations for their modification compared with the number required for the equivalent direct modification of the body. Let us assume this and see where it leads.

Faster evolution is generally favored if humans are typically in competition with other species to be the first to occupy a newly possible ecological niche. Such niches will be created in abundance with every dramatic change in the environment, such as a glaciation and the following deglaciation. Possibly, these specific events just slide the same suite of niches to higher or lower latitudes, but the amount of land area in each is likely to change, leading to under-capacity in some, and thus opportunity. These opportunities will vanish much faster than evolution can follow unless a diversity of phenotypes is already present in the prospective colonizing population, which might happen as a result of genetic drift in multiple, isolated sub-populations.

If technologically assisted evolution has general advantages, then we can expect its importance to grow with increases in the reach of technology. Today, we seem to be at a threshold, with male-to-female and female-to-male gender transitions becoming well known. Demand for this service is probably being driven by disordered neural development during fetal life due to contamination of the fetus by environmental pollutants that have estrogenic properties (e.g., bisphenol A, PCBs, and phthalates). The result is the birth of individuals with disordered and mutually contradictory gendered aesthetic preferences, which is tragic. However, it is an interesting natural experiment.

With further development of cell biology in the direction of supporting body-modification technology, who knows what bizarre hankerings will see the light of day on demand from some customer? Remember that in evolution, the mutation comes first, and the mutation is random. Predictably, and sadly, most such reckless acts of self-indulgence will be punished by reduced employability and reduced reproductive success, doubtless exacerbated by prejudice on the part of the normal, normative majority.

However, the very occasional success story is also to be expected, involving the creation of fortuitously hyperfunctional individuals, and thus the technologically assisted creation of a new pre-species of human.

If the engineering details learned by the body-modification trade during this process are then translated into germ-line genetic engineering, then a true artificial humanoid species will have been created.
After the Pleistocene, the Plasticine.

Photo by Дмитрий Хрусталев-Григорьев on Unsplash

Thursday, December 6, 2018

#46. The Goddesses of the Glacier [evolutionary psychology]

Sorry, not found on Unsplash. (Spirits are flying in bearing mukluks and a parka.)


Red, theory; black, fact.

This post is about long hair, of all things (which I totally dig), and I will argue that humans evolved the trait to keep them warm during geologically recent continental glaciations. (Please pardon the teleological phrasing; I use it here only for the sake of brevity.)

Can having long hair really confer such a benefit under cold conditions? The anecdotal evidence supporting this idea seems abundant. For example, go to the site shown below for a near-unanimous list of affirmative replies to the question: ‘Does long hair keep you warm?’ One respondent in particular (#32), from Sweden, seems to have exactly reproduced the method that must have been used in eras of glaciation, assuming that our distant forebears could at least make themselves simple parkas out of animal skins.

They would have supplemented this protection by tucking their long hair down inside the parka. Leaving it outside would have been good fashion but bad engineering, since the locks would have been quickly parted by the first gust of wind and precious, life-saving body heat lost.

I began this with mention of goddesses, but of course the males would have had long hair too, probably a meter long, in outrageous violation of modern gender expectations concerning hair length.

I find that this situation suddenly makes better aesthetic sense if you imagine this long hair as tousled in the males and smooth and perfect-looking in the females (and perhaps only in the portion showing above the neckline). Seen this way, both males and females look gorgeous in the imagination and the aesthetic problem is solved. I call this the "rock-star solution." The females could have smoothed their hair by lubricating it with oil and brushing, and a brush is not hard to make. Meanwhile, the males would only need their fingers for cultivating a charming, insouciant look.

While moderns socially code gender as hair length, this parameter was unavailable to our glaciation-era forebears (the last glaciation maximum occurred 26,000 years ago) because they would have been unwilling to cut their hair, knowing at some level of insight that they needed it for survival and mating success. Therefore, they might have coded gender as hair smoothness as described above. 

I assume that these people were living in some glaciation "refugium," as such terrain is technically called, which is a fortuitously ice-free zone surrounded by continental glacier.

While writing this, I was struck by the amount of detailed information I was able to retrieve from my own aesthetic preferences, some of which would have evolved under stringent, cold-climate conditions to produce mate choices favoring traits with survival value. 

The heart may have its reasons that reason knoweth not, but reason is learning*.

11-26-2019 I have not mentioned beards yet and the main question there is why women do not have them. The answer seems to be that a long beard would interfere with breastfeeding, whereas long scalp hair can be pushed back. Moreover, women have more subcutaneous fat than men, so their thermoregulation problem in a cold climate would not be as severe.

*Based on a famous saying by the philosopher Blaise Pascal.

Sunday, November 18, 2018

#45. The Denervation-supersensitivity Theory of Mental Illness [neuroscience, evolution, genetics]

Red, theory; black, fact.

People get mental illness but animals seemingly do not, or at least not outside of artificial laboratory models such as the unpredictable-mild-stress rodent model of depression. A simple theory to account for this cites the paleontological fact that the human brain has been expanding at breakneck speed over recent evolutionary time, and postulates that this expansion is ongoing at the present time and that mental illness is the price we are paying for all this brain progress.

In other words, the mentally ill carry the unfavorable mutations that have to be selected out during this progress, and the mutation rate in certain categories of mutation affecting human brain development is elevated in modern humans by some sort of "adaptive" hot-spot system. "Adaptive" is in scare quotes here to indicate that the adaptation inheres in changes in the standard deviation of traits, not the average, and is therefore not Lamarckian.

In brain evolution, the growth changes in the various parts very probably have to be coordinated somehow. I conjecture that there is no master program doing this coordination. Rather, I conceive of the human brain as comprising scores of tissue "parcels," each with its own gene to control the final size that parcel reaches in development. (This idea is consistent with the finding of about 400 genes in humans that participate in establishing body size.) All harmonious symmetry, even left-right symmetry, would have to be painstakingly created by brute-force selection, involving the early deaths of millions of asymmetrical individuals. This idea was outlined in post 10.

Assuming that left and right sides must functionally cooperate to produce a fitness improvement, mutations affecting parcel growth must occur in linked, left-right pairs to avoid irreducible-complexity paradoxes. I have previously conjectured in these pages that the crossing-over phenomenon of egg and sperm maturation serves to create these linked pairs of mutations, where the two mutations are identified with the two ends of the DNA segment that translocates. (See "Can Irreducible Complexity Evolve?")

Most of the evolutionary expansion of the human brain appears to be focused on association cortex, which I conjecture implements if-then rules, like those making up the knowledge bases familiar from the field of artificial intelligence. The "if" part of the rule would be evaluated in post-Rolandic cortex, i.e., in temporal and parietal association cortices, and the "then" part of the rule would be created by the pre-Rolandic association cortex, i.e., the prefrontal cortex. The white-matter tracts running forward in the brain would connect the "if" part with the "then" part, and the backward-running white-matter tracts would carry priming signals to get other rules ready to "fire" if they are commonly used after the rule in question.
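As a loose, purely illustrative analogy (the rule names and priming scheme below are hypothetical, not claims about actual cortical coding), here is the kind of if-then knowledge base with forward priming of likely next rules that this paragraph has in mind:

# Minimal production-system sketch: "if" parts matched against percepts, "then" parts issued
# as actions, with each fired rule priming the rules that commonly follow it.
rules = {
    "see_kettle_boiling": {"then": "turn_off_heat", "primes": ["see_teapot"]},
    "see_teapot":         {"then": "pour_water",    "primes": ["see_full_cup"]},
    "see_full_cup":       {"then": "drink",         "primes": []},
}

priming = {name: 0.0 for name in rules}   # readiness of each rule to fire

def fire(condition):
    rule = rules.get(condition)
    if rule is None:
        return None
    for nxt in rule["primes"]:            # analog of backward-running tracts: prime likely successors
        priming[nxt] += 1.0
    return rule["then"]                   # analog of forward-running tracts: deliver the "then" part

for percept in ["see_kettle_boiling", "see_teapot", "see_full_cup"]:
    print(percept, "->", fire(percept), "| priming:", priming)

The point of the sketch is only the division of labor: condition matching in one place, action issuance in another, and a separate priming signal that raises the readiness of whichever rules usually come next.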

Due to such tight coordination, I would expect that the ideal brain will have a fixed ratio of prefrontal cortex to post-Rolandic association cortex. However, the random nature of the growth-gene bi-mutations (perhaps at mutational hot-spots) permitting human brain evolution will routinely violate this ideal ratio, leading to the creation of individuals having either too much prefrontal cortex or too much temporal/parietal cortex. In the former case, you will have prefrontal cortex starved of sensory input. In the latter case, you will have sensory association cortex starved of priming signals feeding back from motoric areas.

Denervation supersensitivity occurs when the normal nerve supply to a muscle is interrupted, resulting in a rapid overexpression of acetylcholine receptors on the muscle. This can be seen as an attempt to compensate for weak nerve transmission with a tremendous re-amplification of the signal by the muscle. Analogous effects have been found in areas of the cerebral cortex deprived of their normal supply of sensory signals, so the effect seems to be quite general.

In cases of genetically-determined frontal-parietal/temporal imbalance, I conjecture that the input-starved side develops something like denervation supersensitivity, making it prone to autonomous, noise-driven nervous activity.

If the growth excess is in sensory association cortex, this autonomous activity will manifest as hallucinations, resulting in a person with schizophrenia. If the growth excess is in the prefrontal cortex, however, the result of the autonomous activity will be mania or a phobia. Depression may originally have been an adaptation to the presence of a man-eating predator in the neighborhood, but in civilized contexts, it can get activated by the unpredictable (to the sufferer) punishments resulting from manic activity. If the mania is sufficiently mild to co-exist with depression, as in type II bipolar disorder, then the overall effect of the depressive component may be like a band-aid on the mania.

The non-overgrown association cortex might even secondarily develop the opposite of denervation supersensitivity as the result of continual bombardment with autonomous activity from the other side of the Rolandic fissure. This could account for the common observation of hypofrontality in cases of schizophrenia.

#44. Sunshine in Covey's Gap [evolutionary psychology, neuroscience]

Red, theory; black, fact.

Who has not felt frustration at the difficulty, even seeming impossibility, of making a disagreeable emotion go away before we weaken and act it out, to our detriment? Techniques of true emotional control, i.e., making the bad feelings disappear rather than mounting white-knuckle, open-ended resistance to acting them out, are not impossible, just non-obvious. You just have to persuade yourself that this bad thing is good, and believe it.

For the modern person, that second part, the believing, is difficult to achieve robustly if one is using religious solutions to the problem, the domain of soteriology (being "saved"), easier with psychoanalytical solutions, and, I am here to say, easiest of all with scientific solutions. "Believing," for me, means being prepared to bet your life on the truth of a proposition.

Stephen Covey writes in "The Seven Habits of Highly Effective People" that between stimulus and [emotional] response, humans have, somewhat metaphorically, a "gap" in the causal chain and animals do not. In the gap, you find such things as imagination, self-awareness, conscience, and self-will. He correctly lays tremendous emphasis on this point. George Santayana seems to have grasped this truth when he wrote: "Our dignity is not in what we do but in what we understand. The whole world is doing things." [source: Wikiquote, accessed 11-06-2018]

Neuroscientist Joseph LeDoux has even elucidated what could be the neural pathways that make Covey's gap possible. A direct pathway from the thalamus to the amygdala mediates the basic fear response but an indirect pathway that leads from thalamus to cerebral cortex to amygdala provides a more nuanced, intelligent amendment to the first response. Full cancellation of the direct pathway by the indirect would account for Covey's gap, and in principle, this could be done by a cortical relay through the inhibitory interstitial neurons of the amygdala that terminate on the amygdalar projection cells.
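A toy signal-flow sketch of this cancellation idea follows, with assumed delays and weights (nothing here comes from LeDoux's data; it only illustrates how a slower, sign-inverted cortical relay could null the fast direct drive to the projection cells):

# Toy two-pathway model (illustrative parameters only).
# A fast direct thalamus->amygdala volley arrives early; a slower thalamus->cortex->amygdala
# relay, routed through inhibitory interneurons, arrives later with matching weight and
# cancels the drive to amygdalar projection cells, leaving only a brief initial response.

stimulus = [1.0] * 10                     # a sustained threat signal over 10 time steps

def delayed(signal, delay, gain):
    """Shift a signal by 'delay' steps and scale it by 'gain'."""
    return [0.0] * delay + [gain * s for s in signal[:len(signal) - delay]]

direct   = delayed(stimulus, delay=1, gain=+1.0)   # fast excitatory pathway
indirect = delayed(stimulus, delay=4, gain=-1.0)   # slower cortical relay, net inhibitory

net_drive = [d + i for d, i in zip(direct, indirect)]
print(net_drive)   # brief early fear response, then full cancellation: the "gap"

Printed out, the net drive rises for a few steps and then drops to zero once the cortical appraisal arrives, which is the sense in which the indirect pathway could open up Covey's gap.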

The doctrines of classical religion probably lead to such cancellation of emotions such as hate and fear by activating the same circuits that are used by a parent to reassure a needlessly fearful infant.

Apparently, classical religion is all about getting people to do the right things for the wrong reasons. When the discipline of evolutionary psychology is sufficiently developed, we can look forward to the age when people do the right things for the right reasons.

Friday, September 7, 2018

#43. A Discovery of Hackers [population, evolutionary psychology]

Red, theory; black, fact.

9-07-2018: I was saving this for the Sunday before Halloween, but decided that it couldn't wait. The basic idea of this post is that the hacker phenomenon is psychologically and sociologically akin to what was once called witchcraft. Let me hasten to clarify that I am talking about witchcraft the social phenomenon, because I don't believe in anything supernatural. However, the height of the witchcraft hysteria in Europe occurred during the sixteenth century, when there were no computers. (I focus on Europe here because my ancestors came from there, as did those of most people I know.) It was, however, a time of unprecedented scientific advance, and if science paced technology then as it does now, quite a few new technologies were becoming known for the first time.

I suggest that the defining toxic ingredient in black-hat hacking is new technology per se. We should therefore expect that with time, computer hacking will spread to new-technology hacking in general and that the computer-centric version must be considered the embryonic form. This is bad news because there has never been so much new technology as now, but at what point in history has this statement not been true?

Belief in and persecution of witches is so widespread across human cultures that it must be considered a cultural universal. Scholars focus on the persecution part, blithely assuming that there is absolutely nothing real driving it, and that the subject people of the study are, by implication, a bunch of blithering idiots, and sadists to boot. I find this stance elitist. Never judge a man until you have walked a mile in his shoes. These people all have brains in their heads built to the exact same design as our own, and the role of education may be overrated when cultural universals are in play.

I suggest that the defining idea of the witch/technology-hacker (tacker) is viewing new technology as a possible means to increased personal power. To produce a tacker, this idea must be combined with a mad-dog rejection of all morality. 

A technology ideal for tacking/witchcraft must be usable without the identity of the agent coming into general knowledge, and is thus sociologically similar to the ring of Gyges mentioned in Plato's Republic. The anonymity conferred by the Internet makes it one of our worst rings of Gyges, but just wait. More will be discovered in other realms of technology as the hackers branch out, perhaps in unholy alliance with the currently popular Maker movement. Makers, wake up! It's not too early for a manifesto!

How common are Gygean technologies? Hard to say, but it may help to list some.
  • Ionizing radiation was known from the work of Roentgen in 1895 (x-rays) and Villard in 1900 (gamma rays), and for the first time a means to destroy healthy, living tissue silently, through walls solid enough to conceal all signs of the agent, had become available. (See my blog "Journalist's Progress" at https://xrra.blogspot.com; link under reconstruction.)
  • The lead pencil, introduced in the sixteenth century already alluded to, was originally made with actual lead metal (instead of graphite and clay mixtures), which we now know to be insidiously neurotoxic, especially to children--knowledge to warm the heart of any proper witch.
  • In the time of Christ in the Middle East, the Roman occupiers knew of ten or so plant-derived poisons, including opium. The very concept of a poison could have been new in those days, and poisons are the classical hard-to-detect weapons. If the weapon is hard to detect, so is the agent. A crypto-technological explanation for some of the events of the New Testament seems possible.
Gygean weapons are doubly "invisible" when based on new technology because these modi operandi are not yet on anybody's radar, so the first several people who spot them are likely to be disbelieved and to have their sanity questioned.

Witches have always operated in the zone of perceptual blindness to abuses that transiently opens up after the introduction of any new technology. The psychological invisibility of weapons based on new technology is probably the factor that led witches to become associated with magic. 

Moreover, since the technology is unprecedented in human evolution, the levels of resentment it can induce in the victims are potentially unprecedented and unphysiologically intense, leading to grotesquely disproportionate punishments being meted out to discovered witches, and this for strings of crimes that would have been extremely serious even under strictly proportionate punishment. I suspect that the historical accounts of witch-burnings have all been cleaned up for a squeamish readership.

Why were a majority of European witches female? At the height of the anti-witch hysteria, the Black Death was raging and the local human population was probably having trouble keeping its numbers up. On general adaptationist assumptions, all kinds of social forces would have been working to reduce women to baby-making machines, whatever their endowments or aptitudes. This would have created an inevitable push-back in the most intelligent women to reclaim some of their personal power, and witchcraft would have seemed an attractive option for doing this.

Today, the hackers (soon-to-be tackers) are mostly male and the demographic challenge is too many people, not too few. Calhoun's overpopulation experiments on rodents imply that people will become more aggressive if forced to live at higher population densities, and such a relentless increase in aggressiveness may be driving the current reemergence of the witch/tacker. 

It doesn't help that organized religion, the traditional great enemy of witchcraft, is withering on the vine in this country, probably due to the intellectual fallout from Darwin's theory of evolution combined with the failure of the public to understand that a scientific world-view is never finished.

9-08-2018: Proposed definition of "witch": a person in moral free fall under the corrupting influence of technologies that lend themselves to secret abuse for the increase of personal influence.