
Sunday, May 23, 2021

#69. Storming South [evolution]

EV

Red, theory; black, fact.

This is a theory of the final stages of human evolution, when the large brain expansion occurred.
At least, the brains expanded. Sorry, I don't belong to this species. The Linnaean binomial literally means "wise man." What would be the Latin for "wise guy"?

Homo sapiens: created by ice

H. sapiens appears to have arisen from Homo erectus over the last 0.8 million years, driven by climate instability in the apparent area of origin, East Africa. During this time, Europe was getting glaciated every 0.1 million years because of the astronomical Milankovitch cycle, a rhythm in the eccentricity of the Earth's orbit driven largely by the gravitational influence of the planet Jupiter.
However, I am thinking of the hominins who had settled in Europe (or Asia; it does not matter for this argument) during the interglacial periods (remember that H. erectus was a great disperser). When the ice began advancing again, they faced much worse cooling and drying than their African contemporaries did, and thus much greater selection pressures. At least during the last continental glaciation, the ice cap extended only to the Baltic Sea at its maximum, but long before the ice arrives, the land becomes tundra, which can support only a very thin human population. In any given glaciation, the number of souls per hectare the land could support was relentlessly declining in northern Europe and Asia, and eventually the residents had to get out of Dodge and settle on land further south, almost certainly over the dead bodies of the former owners. This would have selected Europeans and Asians for warlike tendencies and skill in warfare, which explains a lot of human history.

Our large brains

However, our large brains seem to be great at something else besides concocting Games of Thrones: namely, environment modification. It's a no-brainer that the first thing someone living in the path of a 2-km wall of ice needs is to keep from freezing to death, and this would have been the first really good reason to modify environments. Unlike chipping a stone axe, environment modification involves fabricating something bigger than the fabricator. Even a parka has to be bigger than you, or you can't get into it. This plausibly would have required a larger brain to enable a qualitatively new ability: making something you cannot see all at once while working on it at close range.

Our rhythmic evolution

After parkas, early northerners might have evolved enough association cortex (maybe on the next glaciation cycle) to build something a little bigger, like a tent or a lean-to. On the next cycle, they might have been able to pull off a decent longhouse made of wattle. On the next, a jolly good castle surrounded by cultivated lands and drainage ditches. These structures would have delayed the moment of decision when you have to go south and take on the Pleistocene-era low-brows. That delay buys you time to build up your numbers, and I understand that winning battles is very much a numbers game. Therefore, skill at environment modification would have been selected for in tandem with making like army ants.

Where is the fossil evidence for this theory?

Why do we not find fossil evidence of all this in Europe or Asia? <05-19-2022: Actually, we do: the Neanderthals and Denisovans, who have been difficult to account for in terms of previous theories of human origins.> My scenario can be defended against the inconvenient fossil evidence for a human origin in East Africa by citing the well-known incompleteness of the fossil record and its many biases, but, of course, I want details. Note, however, what else is in East Africa: the Isthmus of Suez, a land bridge to both Europe and Asia via the Arabian tectonic block, created by plate tectonics near the end of the Miocene and thus antedating both H. sapiens and H. erectus. Not only can hominins disperse through it to other continents during interglacials, but they can come back in, fiercer and brainier than before, when the ice is advancing again, and then deposit their fossil evidence in the Rift Valley region of East Africa. The Eurasian backflow event of 3,000 years ago may be a relatively recent example of this. The Isthmus of Suez is low-lying and thus easily drowned by the sea, but the probability of this was minimal at times of continental glaciation, when sea levels are lowest. I assume that early hominins expanded like a gas into whatever continent they could access. Increasing glaciation/tundrafication of that continent would have recompressed the "gas" southward, causing it to retrace its path, partly back into Africa.

Pleistocene selection pressures

To reiterate, this process would have been accompanied by great mortality and therefore, potentially, much selection. Moreover, during the period we are considering, temperatures were declining most of the time; the plot of temperature versus time has a saw-tooth pattern, with long declines alternating with short warming events, and it is the declines that would have been the times of strongest natural selection on hominins living at high latitudes.

Plebius sapiens.

A limestone block in Canada showing scratches left by stones
embedded in the underside of a continental glacier.
The rock has also been ground nearly flat by the same process. Scary.

Glaciated boulder by night. Have a nice interglacial.


Thursday, May 23, 2019

#53. Advanced Human Depopulation Model [population, evolutionary psychology]

Picture 1: A four-stage model of a human depopulation event. C = cycle; growth = growth phase; depop = depopulation

PO     EP     
Red, theory; black, fact


I present in Picture 1 a four-stage model of human depopulation events, intended to account for a broader range of data than before. The same two emotional programs, the anger cycle and the sadness cycle (see post #41), occur in two "generations," with the second generation showing greater violence and using modified signals.
  • Stage 1: depopulation by emigration; accomplishes dispersal of the human species; coordinated by an exchange of anger signals;
  • Stages 2-4: depopulation by mass murder; accomplishes long-term confinement of population density within limits;
  • Stage 2: coordinated by an asymmetric exchange of contempt and sadness signals; has similarities with cannibalism;
  • Stage 3: total war program; coordinated by an exchange of anger signals with mimicry added;
  • Stage 4: loss of civilization; triggered by a repudiation of the social contract by trusted elites with grudges; coordinated by increasing paralysis on the part of victims and increasing cynicism on the part of perpetrators. May be too recent an evolutionary development to have an efficient halting signal.

Nevertheless, the modes of worship of Islam are the best place to look for such a signal (06-02-2019: or other remedy) if it exists. 

In this connection, the Islamic prayer discipline has extraordinary potential to alter brain physiology through variations in blood flow to the brain, an organ known to be highly sensitive to such variations. The variations would come about as a result of the highly regimented posture changes occurring during Islamic prayer. I have coded these postures according to their probable effect on blood pressure measured at the brain, and the result looks like this:
Picture 2. The inferred brain physiology of Islamic prayer. Source of data: YouTube, "Time to pray with Zacky," accessed 05-23-2019.

Shown are my inferred variations in brain oxygenation during two rakat, or units of prayer. Bowing is coded the same as sitting, namely 1; prostration is coded as 2; and standing is coded as 0. Some forms of Islam prescribe up to 19 rakat per day. Special procedures (Sujud Sahwi) exist for fixing prayers performed erroneously due to "forgetfulness," but this "forgetfulness" I find suggestive of temporary brain dysfunction due to lack of oxygen from standing up too quickly, possibly at about minute 2, above.
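For readers who want the coding scheme stated precisely, here is a minimal sketch in Python. The posture sequence is a hypothetical example for illustration only, not a transcription of the video cited above.

    # Posture-to-code mapping used in Picture 2: standing = 0,
    # bowing and sitting = 1 (treated as the same head height), prostration = 2.
    posture_code = {"standing": 0, "bowing": 1, "sitting": 1, "prostration": 2}

    # One illustrative rakat (hypothetical sequence, for demonstration only)
    rakat = ["standing", "bowing", "standing", "prostration", "sitting", "prostration"]

    two_rakat = rakat * 2
    codes = [posture_code[p] for p in two_rakat]
    print(codes)  # [0, 1, 0, 2, 1, 2, 0, 1, 0, 2, 1, 2]

A higher code corresponds to a lower head position and hence, on my assumption, a transiently higher blood pressure measured at the brain.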
06-02-2019: Unprompted revision for clarity and sensitivity: The above observation is to help establish that Islamic prayer manipulates a variable that matters, always an important issue at the outset of a research project. You don't want to waste taxpayer money blindly researching variable after variable and concluding at great expense merely that none of them was relevant.

Monday, December 31, 2018

#48. Science and Proto-science [evolutionary psychology]





Red, theory; black, fact.

Why does religion continue to be so popular in today's supposedly enlightened age? In what category of things should we place religion for purposes of analysis? This is a very important question. The least bad answer that I have come up with is: "Religion is the last protoscience." (By this I mean "classical protoscience"; a contemporary field of study, string theory, has also been labelled "protoscience," a result I base on a DuckDuckGo search on "Is string theory a protoscience?" on 20 Feb, 2022.)

Protoscience is most easily defined by a few well-known examples: alchemy and astrology. These disciplines can be thought of as crude, primordial versions of chemistry and astronomy, respectively, unable to quickly divest themselves of laughably bad theories owing to an over-reliance on aesthetics as a way to truth.

If religion is a protoscience, what, then, is the corresponding science? Will religion someday transform into some kind of super-science, marvelous beyond all prior imagining, and capable of robustly duplicating all the miracles of Christ, just for starters?

08-03-2020: Formerly at this location, now deprecated: Religion is the protoscience of origins and Darwin's theory its successor via the clergyman Malthus. Malthus was one of Darwin's influences, as attested explicitly in the writings of the latter.

07-26-2020: The science that could replace the protoscience religion is likely to be the study of adaptive, distributed, and unconscious behavioral effects in human populations. <07-30-2020: This will be a division within sociobiology focused on human swarm intelligence acting on an historical time scale.> From my own examined experience, I have reason to believe that such things exist. I called them "macro-homeostatic effects" in the post "The Drill Sergeants of the Apocalypse."

Alchemy is thought to have become chemistry with the isolation of oxygen in pure form by Priestley, followed in short order by its recognition as an element by Lavoisier, who had met Priestley in Paris and learned of the new "air" directly from its discoverer. This clue led Lavoisier to a correct theory of the nature of combustion. Priestley published his discovery of oxygen (Lavoisier's term), which he called "dephlogisticated air" (an alchemical term), in letter form, in 1775.

06-28-2019: The corresponding intellectual hand-off from astrology to astronomy seems to have been from Tycho Brahe (1546-1601), who was much involved with astrology, to his onetime assistant Johannes Kepler (1571-1630; "The Legislator of the Heavens"), who derived three famous mathematical laws of planetary motion from Brahe's data.

While the former astrology continues to this day as basically a form of amusement and a favorite whipping-boy of sophomores everywhere who are just discovering the use of their brains, and the former alchemy has utterly perished (as theory, not practice), religion continues to pay its way down the time stream as a purveyor of a useful approximate theory.

An approximate theory is useful to have if all you need is a quick and dirty answer. The theory that the Earth is flat is an approximate theory that we use every time we take a step. The corresponding exact theory, that the Earth is spherical and gravitating, is only needed for challenging projects such as travelling to the moon.

03-13-2020: Thus, the God hypothesis is the theory of natural selection seen "through a glass darkly." However, the experiences contributing to the formulation of the God hypothesis would have been due to any cause of seemingly miraculous events over the horizon or beyond the reach of individual memory. This would comprise a mixture of the fastest effects of evolution and the slowest effects of synaptic plasticity/learning (e.g., developmental sensitive periods). However, the capacity for learning is itself due to natural selection and learning is, like natural selection, a trial-and-error process. Thus, the two sources of biological order hinting at the existence of God should usually be pulling in the same direction but perhaps with different levels of detail. Modern skepticism about religion seems to be directed at the intellectual anchor point: the God hypothesis. Since I believe that they are best de-faithed who are least de-faithed, let us simply shift the anchor to natural selection and carry on.

I think it premature to abandon classical religion as a source of moral guidance before evolutionary psychology is better developed, and given the usefulness of approximate theories, complete abandonment may never be practical. However, in our day, humanity is beset with many pressing problems, and although atheism appears to be in the ascendant, it may be time to reconcile religion with science, so as not to throw out any babies with the bathwater.

The modes of worship in use in many modern religions may well confer psychological benefits on the pious not yet suspected or articulated by mainstream science. Scientific investigation of the modes of worship that many religions have in common seems in order, especially since they amount to folk wisdom, which is sometimes on the money. Examples of common practices that seem to have potential for inducing neurophysiological changes are prayer, fasting, pilgrimage, incense-burning, and even simple congregating.

Photo by JJ Jordan on Unsplash

Sunday, December 30, 2018

#47. Body-mod Bob's [evolution, evolutionary psychology]




EV     EP     
Red, theory; black, fact.

In the previous post, "Goddesses of the Glacier," evolution appears to be operating in cooperation with a general capacity for technology. Natural selection operates on the brain pathways underlying our aesthetic preferences concerning our own appearance and that of possible reproduction partners and then a technology is automatically developed to satisfy them.

As a first example, consider the oil-and-brush technology previously assumed for differentiating women from men by hair smoothness. A further step in this direction is to posit that hair color may have been used to code gender. The first step would have been selection for blond(e) hair in both women and men. Since this is a very light color, it shows the effect of dyeing maximally. Concurrently, the aesthetic preferences of men and women would have been differentiated by selection, resulting in blonde women who experience a mild euphoria from being blonde and blond men who experience a mild dysphoria from the same cause. The men would predictably get busy inventing hair-dyeing technologies to rectify this. The necessary dyes are readily obtained from plant sources such as woad and walnut shells. The result would be an effective blonde-female/non-blond-male gender code.

This style of evolution could be very fast if the brain pathways of aesthetic preferences require few mutations for their modification compared with the number required for the equivalent direct modification of the body. Let us assume this and see where it leads.

Faster evolution is generally favored if humans are typically in competition with other species to be the first to occupy a newly possible ecological niche. Such niches will be created in abundance with every dramatic change in the environment, such as a glaciation and the following deglaciation. Possibly, these specific events just slide the same suite of niches to higher or lower latitudes, but the amount of land area in each is likely to change, leading to under-capacity in some, and thus opportunity. These opportunities will vanish much faster than evolution can follow unless a diversity of phenotypes is already present in the prospective colonizing population, which might happen as a result of genetic drift in multiple, isolated sub-populations.

If technologically assisted evolution has general advantages, then we can expect its importance to grow with increases in the reach of technology. Today, we seem to be at a threshold, with male-to-female and female-to-male gender transitions becoming well known. Demand for this service is probably being driven by disordered neural development during fetal life due to contamination of the fetus by environmental pollutants that have estrogenic properties (e.g., bisphenol A, PCBs, phthalates). The result is the birth of individuals with disordered and mutually contradictory gendered aesthetic preferences, which is tragic. However, it is an interesting natural experiment.

With further development of cell biology in the direction of supporting body-modification technology, who knows what bizarre hankerings will see the light of day on demand from some customer? Remember that in evolution, the mutation comes first, and the mutation is random. Predictably, and sadly, most such reckless acts of self-indulgence will be punished by reduced employability and reduced reproductive success, doubtless exacerbated by prejudice on the part of the normal, normative majority.

However, the very occasional success story is also to be expected, involving the creation of fortuitously hyperfunctional individuals, and thus the technologically assisted creation of a new pre-species of human.

If the engineering details learned by the body-modification trade during this process are then translated into germ-line genetic engineering, then a true artificial humanoid species will have been created.
After the Pleistocene, the Plasticine.

Photo by Дмитрий Хрусталев-Григорьев on Unsplash

Friday, September 7, 2018

#43. A Discovery of Hackers [population, evolutionary psychology]

PO     EP     
Red, theory; black, fact.

9-07-2018: I was saving this for the Sunday before Halloween, but decided that it couldn't wait. The basic idea of this post is that the hacker phenomenon is psychologically and sociologically akin to what was once called witchcraft. Let me hasten to clarify that I am talking about witchcraft the social phenomenon, because I don't believe in anything supernatural. However, the height of the witchcraft hysteria in Europe occurred during the sixteenth century, when there were no computers. (I focus on Europe here because my ancestors came from there as did those of most people I know.) It was, however, a time of unprecedented scientific advance, and if science paced technology then as now, quite a few new technologies were coming into knowledge for the first time.

I suggest that the defining toxic ingredient in black-hat hacking is new technology per se. We should therefore expect that with time, computer hacking will spread to new-technology hacking in general and that the computer-centric version must be considered the embryonic form. This is bad news because there has never been so much new technology as now, but at what point in history has this statement not been true?

Belief in and persecution of witches is so widespread across human cultures that it must be considered a cultural universal. Scholars focus on the persecution part, blithely assuming that there is absolutely nothing real driving it, and that the people under study are, by implication, a bunch of blithering idiots, and sadists to boot. I find this stance elitist. Never judge a man until you have walked a mile in his shoes. These people all have brains in their heads built to the exact same design as our own, and the role of education may be overrated when cultural universals are in play.

I suggest that the defining idea of the witch/technology-hacker (tacker) is viewing new technology as a possible means to increased personal power. To produce a tacker, this idea must be combined with a mad-dog rejection of all morality. 

A technology ideal for tacking/witchcraft must be usable without the identity of the agent coming into general knowledge, and is thus sociologically similar to the ring of Gyges mentioned in Plato's Republic. The anonymity conferred by the Internet makes it one of our worst rings of Gyges, but just wait. More will be discovered in other realms of technology as the hackers branch out, perhaps in unholy alliance with the currently popular Maker movement. Makers, wake up! It's not too early for a manifesto!

How common are Gygean technologies? Hard to say, but it may help to list some.
  • Ionizing radiation was known from the work of Roentgen in 1895 (x-rays) and Villard in 1900 (gamma rays); for the first time, a means to destroy healthy, living tissue silently, through walls solid enough to conceal all signs of the agent, had become available. (See my blog "Journalist's Progress" at https://xrra.blogspot.com; link under reconstruction.)
  • The lead pencil, introduced in the sixteenth century already alluded to, was originally made with actual lead metal (instead of today's graphite-and-clay mixtures), which we now know to be insidiously neurotoxic, especially to children--knowledge to warm the heart of any proper witch.
  • In the time of Christ in the Middle East, the Roman occupiers knew of ten or so plant-derived poisons, including opium. The very concept of a poison could have been new in those days, and poisons are the classical hard-to-detect weapons. If the weapon is hard to detect, so is the agent. A crypto-technological explanation for some of the events of the New Testament seems possible.
Gygean weapons are doubly "invisible" when based on new technology because these modi operandi are not yet on anybody's radar, so the first x people who spot them are likely to be disbelieved and their sanity questioned.

Witches have always operated in the zone of perceptual blindness to abuses that transiently opens up after the introduction of any new technology. The psychological invisibility of weapons based on new technology is probably the factor that led witches to become associated with magic. 

Moreover, since the technology is unprecedented in human evolution, the levels of resentment inducible in the victims are potentially unprecedented and unphysiologically intense, leading to grotesquely disproportionate punishments being meted out to discovered witches, and this for strings of crimes that would have been extremely serious even under strictly proportionate punishment. I suspect that the historical accounts of witch-burnings have all been cleaned up for a squeamish readership.

Why were a majority of European witches female? At the height of the anti-witch hysteria, the Black Death was raging and the local human population was probably having trouble keeping its numbers up. On general adaptationist assumptions, all kinds of social forces would have been working to reduce women to baby-making machines, whatever their endowments or aptitudes. This would have created an inevitable push-back in the most intelligent women to reclaim some of their personal power, and witchcraft would have seemed an attractive option for doing this.

Today, the hackers (soon-to-be tackers) are mostly male and the demographic challenge is too many people, not too few. Calhoun's overpopulation experiments on rodents imply that people will become more aggressive if forced to live at higher population densities, and such a relentless increase in aggressiveness may be driving the current reemergence of the witch/tacker. 

It doesn't help that organized religion, the traditional great enemy of witchcraft, is withering on the vine in this country, probably due to the intellectual fallout from Darwin's theory of evolution combined with the failure of the public to understand that a scientific world-view is never finished.

9-08-2018: Proposed definition of "witch": a person in moral free fall under the corrupting influence of technologies that lend themselves to secret abuse for the increase of personal influence.

Wednesday, April 4, 2018

#38. The Fallacy of Justice [evolutionary psychology]

Red, theory; black, fact.

4-04-2018: In my treatment of evil and criminality so far, I have tried to show that they subserve either dispersal or preemptive population reduction, both valuable biological processes that tend to prolong the survival of species.

The algorithms for achieving these ends would have been created over time by some form of evolution, with probably a large component coming from a hypothetical, fast form of evolution I call post-zygotic gamete selection (PGS), where gametes -- individual cells -- are effectively the units of selection. In general, the smaller the unit of selection, the faster the adaptation. PGS may have accelerated evolution to the point where it could be detected by simple record-keeping technologies, which may have led to the first record-keeping peoples eventually realizing that "someone is looking out for us," leading to the invention of monotheism.

The genetically inherited parts of our behavior enter consciousness as emotions, and can therefore be easily identified. The main outlines of civilization are probably due to the inherited behavior component, and not to the reasoning, conscious mind, which is often just a detail-handler. How could civilization rest on a process that can't even remember what happened last weekend?

Thus, humans have a dual input to behavior, emotion and reason. The above arguments show that evil and criminality come from the emotional input. Yet the entire deterrence theory of justice assumes the opposite, by giving the person a logical choice: "You do this, we do that, and you won't like it. So you don't do this, right?"

I'm not so sure. People commit crimes for emotional reasons. As usual, the criminal's reasoning faculties are just an after-the-decision detail handler. The direction that this detail handler then takes is fascinatingly monstrous, but this does not mean that crime begins in reason.

Conclusion: the deterrence theory of justice is based on a category error.

Religion, with its emphasis on emotion, was all the formal "law enforcement system" anyone needed up until only about 200 years ago, at the industrial revolution. We may be able to go beyond where religion takes us by means of a disease model of criminality.

It does make some sense to lock criminals up, because with less freedom they cannot physically commit as many crimes. Many prisons become dungeons, however, because of the public's desire for revenge. Yet all revenge-seeking belongs to the dispersal/depopulation dynamic and is thus part of the problem. A desire for revenge may follow a crime very predictably, but logically, it is a non sequitur.

4-30-2018: A more nuanced theory of crime prevention is possible, where logical and technological constraints on behavior complement efforts to reduce the motivation for committing crimes at the source: the individual's perception of the fairness of society. However, I originally wrote as I did because I don't think that the former is the squeaky wheel at the moment.

Sunday, June 18, 2017

#32. Climate Change [engineering]

Red, theory; black, fact.

Reading "Just Cool It!" by Suzuki and Hanington introduced me to the ancient terra preta agricultural technology, given as a possible solution, or part of the solution, to the global warming problem. The term is apparently Portuguese for "black earth" and the technology involves enriching the soil by ploughing it full of charcoal. Suzuki and Hanington make the point that this should sequester a lot of carbon in the soil, thereby taking it out of the atmosphere. Charcoal is nearly pure carbon. Moreover, charcoal, being indigestible to decay organisms, should stay in the soil for a very long time. The logical raw material for making the charcoal would be either wood from clearing the land for agriculture, or crop residues, the parts of the crop plant that people cannot eat.

I would add here that modern pyrolysis plants produce not only charcoal but also flammable off-gases, which could be used directly as fuel in some future scenario, or catalytically reformed to a liquid fuel for running the tractors and combines. In gaseous form, the fuel could run a steam turbine to produce electricity to supplement that from wind farms, hydro, tidal, geothermal, thorium-nuclear, and photovoltaics.

However, the off-gases are also used to fuel the pyrolysis plant itself. Whether any would be left over for other uses would depend on careful plant design for energy efficiency and on avoiding fuelled drying operations; thus, the feedstock should be sun-dried.
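To make the energy-efficiency point concrete, here is a toy balance in Python. Every number in it is an illustrative assumption, not measured plant data; the only point is that the surplus can shrink toward zero if the design is careless or the feedstock arrives wet.

    # Toy pyrolysis energy balance -- all figures are hypothetical assumptions.
    feedstock_energy = 18.0   # MJ per kg of dry biomass (assumed heating value)
    char_fraction    = 0.30   # mass fraction retained as charcoal (assumed)
    char_energy      = 30.0   # MJ per kg of charcoal (assumed heating value)
    process_heat     = 2.5    # MJ per kg of feedstock to run the pyrolysis (assumed)
    drying_heat      = 0.0    # MJ per kg; nonzero if the feedstock is not sun-dried

    # Energy leaving in the off-gases is what remains after the charcoal is set aside.
    offgas_energy = feedstock_energy - char_fraction * char_energy
    surplus = offgas_energy - process_heat - drying_heat
    print(f"off-gas energy: {offgas_energy:.1f} MJ/kg, surplus: {surplus:.1f} MJ/kg")

Raise drying_heat to represent a fuelled dryer and the surplus available for turbines and tractors falls accordingly, which is why sun-drying matters.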

Schemes like second-generation fuel ethanol are touted as carbon-neutral, but in terra preta with these additions, we have one that is actually carbon-negative.

However, the soil ends up black. No other color is as efficient at converting sunlight into heat, which we don't need more of at this point. This seems to be a problem with the terra preta solution. (The ideal color for avoiding heat production would be white.)

Note that the use of any terrestrial artificial mirror membrane has the drawback that the membrane will rapidly get dirty from dust, pollen, and plant parts, thereby reducing its efficiency. However, a living means of light reflection would renew itself, gratis, each year. Orbiting space mirrors have also been proposed as the solution to global warming. They wouldn't get covered in pollen anytime soon -- just shot full of holes by micrometeorites.

Sunday, February 12, 2017

#24. The Pictures in Your Head [neuroscience]

Red, theory; black, fact.

My post on the thalamus suggests that in thinking about the brain, we should maintain a sharp distinction between temporal information (signals most usefully plotted against time) and spatial information (signals most usefully plotted against space). Remember that the theory of General Relativity, which posits a unified space-time, applies only to energy and distance scales far from the quotidian.

In the thalamus post, I theorized about how the brain could tremendously data-compress temporal information using the Laplace transform, by which a continuous time function, classically containing an infinite number of points, can be re-represented as a mere handful of summarizing points called poles and zeroes, scattered on a two-dimensional plot called the complex frequency plane. Infinity down to a handful. Pretty good data compression, I'd say. The brain will tend to evolve data-compression schemes if these reduce the number of neurons needed for processing (I hereby assume that they always do), because neurons are metabolically expensive to maintain and evolution favors parsimony in the use of metabolic energy.
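To make the compression concrete (this is textbook material, not anything specific to the brain): a single damped oscillation, say one nodding branch, is classically an infinite set of time points, yet its Laplace transform is pinned down by a single pair of poles:

    f(t) = e^{-at} \sin(\omega t)
    \quad \Longrightarrow \quad
    F(s) = \frac{\omega}{(s+a)^2 + \omega^2}

The poles sit at s = -a ± iω, so the entire waveform is summarized by two numbers: the decay rate a and the frequency ω.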

Ultimately, the efficiency of the Laplace transform seems to come from the fact that naturally-occurring time functions tend to be pretty stereotyped and repetitious: a branch nodding in the wind, leaves on it oscillating independently and more rapidly, the whole performance decaying exponentially to stillness with each calming of the wind; an iceberg calving discontinuously into the sea; astronomical cycles of perfect regularity; and a bacterial population growing exponentially, then shifting gears to a regime of ever-slowing growth as resources become limiting, the whole sequence following what is called a logistic curve.

Nature is very often described by differential equations, such as Maxwell's equations, those of General Relativity, and Schrödinger's equation, the three greats. Other differential equations describe growth and decay processes, oscillations, diffusion, and passive but non-chemically energy-storing electrical and mechanical systems. A differential equation is one that contains at least one symbol representing the rate of change of a first variable versus a second variable. Moreover, differential equations seem to be relatively easy to derive from theories. The challenge is to solve the equation, not for a single number, but for a whole function giving the actual value of the first variable versus the second, for purposes of making quantitative, testable predictions, thereby allowing testing of the theory itself. The Laplace transform greatly facilitates the solution of many of science's temporal differential equations, and these solutions are remarkably few and stereotyped: oscillations, growth/decay curves, and simple sums, magnifications, and/or products of these. Clearly, the complexity of the world comes not from its temporal information but from its spatial information. However, spatial regularities that might be exploited for spatial data compression are weaker than in the temporal case.
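Before moving to the spatial domain, a minimal worked example (again textbook material, not original to this post) of how the transform turns a temporal differential equation into algebra. For exponential decay:

    \frac{dx}{dt} = -a\,x, \qquad x(0) = x_0
    \;\Longrightarrow\; sX(s) - x_0 = -aX(s)
    \;\Longrightarrow\; X(s) = \frac{x_0}{s+a}
    \;\Longrightarrow\; x(t) = x_0 e^{-at}

One pole at s = -a, and the whole growth/decay curve follows.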

The main regularity in the spatial domain seems to be hierarchical clustering. For an example of this, let's return to the nodding branch. Petioles, veins, and teeth cluster to form a leaf. Leaves and twigs cluster to form a branch. Branches and trunk cluster to form a tree. Trees cluster to form a forest. This spatially clustered aspect of reality is being exploited currently in an approach to machine intelligence called "deep learning," where the successive stages in the hierarchy of the data are learned by successive hidden layers of simulated neurons in a neural net. Data is processed as it passes through the stack of layers, with successive layers learning to recognize successively larger clusters, representing these to the next layer as symbols simplified to aid further cluster recognition. This technology is based on discoveries about how the mammalian visual system operates. (For the seminal paper in the latter field, see Hubel and Wiesel, Journal of Physiology, 1959, 148[3], pp 574-591.)
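The layered idea is easy to caricature in code. Below is a minimal sketch using NumPy, with layer sizes and random weights that are illustrative assumptions only; a real deep-learning system would learn its weights from data rather than draw them at random.

    import numpy as np

    rng = np.random.default_rng(0)

    def layer(x, w, b):
        # One stage: mix lower-level features linearly, then apply a nonlinearity (ReLU).
        return np.maximum(0.0, w @ x + b)

    # Hypothetical hierarchy: 64 "edge" units -> 16 "part" units -> 4 "object" units
    sizes = [64, 16, 4]
    weights = [rng.normal(scale=0.1, size=(m, n)) for n, m in zip(sizes, sizes[1:])]
    biases = [np.zeros(m) for m in sizes[1:]]

    x = rng.normal(size=sizes[0])   # stand-in for low-level visual features
    for w, b in zip(weights, biases):
        x = layer(x, w, b)          # each pass = one step up the cluster hierarchy

    print(x)  # four "object-level" activations

Each pass through the loop plays the role of one stage in the hierarchy: a larger effective receptive field, a more abstract symbol handed to the next layer.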

Visual information passes successively through visual areas Brodmann 17, 18, and 19, with receptive fields becoming progressively larger and more complex, as would be expected from a hierarchical process of cluster recognition. The latter two areas, 18 and 19, are classed as association cortex, of which humans have the greatest amount of any primate. However, cluster recognition requires specialist neuron sub-types, each looking for a very particular stimulus. To cover even most of the cluster-type possibilities, a large number of different specialists must be trained up. This does not seem like very good data compression from the standpoint of metabolic cost savings. Thus, the evolution of better ability with spatial information should require many more new neurons than in the case of temporal information.

My hypothesis here is that what is conferred by the comparatively large human cerebral cortex, especially the association cortices, is not general intelligence, but facility with using spatial information. We take it on and disgorge it like water-bombers. Think of a rock-climber sizing up a cliff face. Think of an architect, engineer, tool-and-die maker, or tradesperson reading a blueprint. Now look around you. Do we not have all these nice buildings to live and work in? Can any other animal claim as much? My hypothesis seems obvious when you look at it this way.

Mere possession of a well-developed sense of vision will not necessarily confer such ability with spatial information. The eyes of a predatory bird, for instance, could simply be gathering mainly temporal information modulated onto light, used as a servo error for dynamically homing in on prey. To make a difference, the spatial information has to have someplace to go when it reaches the higher brain. Conversely, our sense of hearing is far from useless in providing spatial information. We possess an elaborate network of brain-stem auditory centers for accomplishing exactly this. Clearly, the spatial/temporal issue is largely dissociable from the issue of sensory modality.

You may argue that the uniquely human power of language suggests that our cortical advantage is used for processing temporal information, because speech is a spaceless phenomenon that unfolds only in time. However, the leading theory of speech seems to be the Wittgenstein picture theory of meaning, which postulates that a statement shows its meaning by its logical structure. Bottom line: language as currently understood is entirely consistent with my hypothesis that humans are specialized for processing spatial information.

Since fossil and comparative evidence suggests that our large brain is our most recently evolved attribute, it is safe to suppose that it may be evolving still, for all we know. There may still be a huge existential premium on possession of improved spatial ability. For example, Napoleon's strategy for winning the decisive Battle of Austerlitz while badly outnumbered seems to have involved a lot of visualization. The cultural face of the zeitgeist may reflect this in shows and movies where the hero prevails as a result of superior use of spatial information. (e.g., Star Wars, Back to the Future, and many Warner Bros. cartoons). Many if not most of our competitive games take place on fields, courts, or boards, showing that they test the spatial abilities of the contestants. By now, the enterprising reader will be thinking, "All I have to do is emphasize the spatial [whatever that means], and I'll be a winner! What a great take-home!"

Let me know how it goes, because all this is just theory.

Sunday, October 30, 2016

#19. Explaining Science-religion Antipathy also Explains Religion [evolutionary psychology]

Red, theory; black, fact.

I will be arguing here that the Darwinian selective advantage to humans of having a propensity for religion is that it regulates the pace of introduction of new technology, which is necessitated by the disruptive side effects of new technology.

If this sounds like a weak argument, perhaps people have been chronically underestimating the costs to society of the harmful side effects of new technology, ever since there have been people. Take the downside of the taming of fire, for instance. You can bet that the first use would have been military, just as in the case of nuclear energy. Remember that everything was covered in forests in those days; there must have been an appalling time of fire until kin selection slowly put a stop to it. The lake-bottom charcoal deposits will still be there, if anyone cares to look for them. (Shades of Asimov's story "Nightfall.")

The sedimentary record does not seem to support the idea that the smoke from such a time of fire caused a planetary cooling event sufficient to trigger the last ice age. However, the mere possibility helps to drive home the point, namely that prehistoric, evolutionary-milieu technology was not necessarily too feckless to produce enough disruption to constitute a source of selection pressure.

Natural selection could have built a rate-of-innovation controller by exaggerating people's pleasure at discovering a new, unexplored phenomenon, until they bog down in rapture at that moment and never progress to the next step of actually experimenting or exploring. The latter activities would be just upstream of the nominally controlled process, the introduction of new technology. People's tendency for "rapture capture" would be causally linked via genetically specified neural pathways to the kinds of hardships caused by technological side effects, thereby completing a negative feedback loop that would work like a steam engine governor.
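The governor analogy can be stated as a toy simulation. In the sketch below, every constant is invented for illustration; the only point is the negative-feedback topology: innovation creates hardship with a lag, hardship feeds rapture, and rapture throttles innovation.

    # Toy negative-feedback "governor" -- all constants are illustrative assumptions.
    innovation = 1.0
    hardship = 0.0
    for step in range(10):
        hardship = 0.8 * hardship + 0.5 * innovation   # side effects accumulate, then fade
        rapture = 0.6 * hardship                       # hardship intensifies "rapture capture"
        innovation = max(0.0, 1.0 - rapture)           # rapture throttles innovation
        print(f"step {step}: innovation={innovation:.2f}, hardship={hardship:.2f}")

Run it and the innovation rate settles toward a steady value below its unregulated maximum, which is exactly the behavior of a steam engine governor.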

I conjecture that all present-day religions are built on this phenomenon of "rapture capture." This may explain why the most innovative country, the USA, is also the most religiose, according to Dawkins, writing in "The God Delusion." An Einsteinian sense of wonder at the cosmos that, according to Dawkins, most scientists feel, could be a mild, non-capturing version of the same thing. The unlikely traits attributed to God, omnipotence, omni this and that, could have instrumental value in intensifying the rapture.

Another possible name for what I have been calling rapture could be "arcanaphilia." A basic insight for me here was that religion is fundamentally hedonistic. I do not seem to be straying too far from Marx's statement that religion "is the opium of the people."

These ideas help to explain why some sciences such as astronomy and chemistry began as inefficient protosciences (e.g., astrology, alchemy): they were inhibited from the start by an excessive sense of wonder, until harder heads eventually prevailed (Galileo, Lavoisier). Seen as a protoscience, the Abrahamic religions could originally have been sparked by evidence that "someone is looking out for us" found in records of historical events such as those the ancient Israelites compiled (of which the Dead Sea Scrolls are a surviving example). That "someone" would in reality be various forms of generation-time compensation, one of which I have been calling the "intermind" in these pages. Perhaps when the subject of study is in reality a powerful aspect of ourselves as populations, the stimulus for rapture capture will be especially effective, explaining why religion has yet to become an experimental science.

By the way, there is usually no insurmountable difficulty in experimenting on humans so long as the provisions of the Declaration of Helsinki are observed: volunteer basis only; controlled, randomized, double-blind study; experiment thoroughly explained to volunteers before enrollment; written consent obtained from all volunteers before enrollment; approval of the experimental design obtained in advance from the appropriate institutional ethics committee; and the experiment registered online with the appropriate registry.

Religions seem to be characterized by an unmistakable style made up of little touches that exaggerate the practitioner's sense of wonder and mystery, thus, their arcanaphilic "high." I refer to unnecessarily high ceilings in places of worship, use of enigmatic symbols, putting gold leaf on things, songs with Italian phrases in the score, such as "maestoso," wearing colorful costumes, etc. I shall refer to all the elements of this style collectively as "bractea," Latin for tinsel or gold leaf. I propose the presence of bractea as a field mark for recognizing religions in the wild. By this criterion, psychiatry is not a religion, but science fiction is.

It seems to me that bractea use can easily graduate into the creation of formal works of art, such as canticles, stained glass windows, statues of the Buddha, and the covers of science fiction magazines. Exposure to concentrations of excessive creativity in places of worship can be expected to drive down the creativity of the worshipers by a negative feedback process evolved to regulate the diversity of the species memeplex, already alluded to in my post titled, "The Intermind: Engine of History?"

This effect should indirectly reduce the rate of introduction of new technology, thereby feeding into the biological mission of religion. Religion could be the epi-evolutionary solution, and the artistic feedback could be the evolutionary solution, to the disorders caused by creativity. Bractea would represent a synergy between the two.