
Sunday, July 25, 2021

#71. The Checkered Universe [Physics]


Red, theory; black, fact.


Natty dog/epiphany



The basic theoretical vision

☯This is a theory of everything based on a foam model. The foam is made up of two kinds of "bubbles," or "domains": "plus" and "minus." Each plus domain must be completely surrounded by minus domains, and vice versa. Any incipient violation of this rule, call it the "checkerboard rule," causes domains of like type to fuse instantly until the status quo is restored, with release of energy. The energy, typically electromagnetic waves, radiates as undulations in the plus-minus interfaces. The result of many such events is progressive enlargement and diversification of domain sizes. This process, run backward in time, appears to result in a featureless, grey nothingness (imagining contrasting domain types as black and white, inspired by the Yin-and-Yang symbol), thereby giving a halfway-satisfying explanation of how nothing became something. <03-12-2022: Halfway, because it’s an infinite regress: explaining the phenomenon in terms of the phenomenon, and a hint that I am not out of the box yet. Invoking progressively deepening shades of gray in forward time, to gray out and thus censor the regress encountered in backward time, looks like a direction to explore.> I assume a Law of Conservation of Nothingness, whereby plus domains and minus domains must be present in equal amounts, although this can be violated locally. This law is supposed to be at the origin of all the other known conservation laws. The cosmological trend favors domain fusion, but domain budding is nevertheless possible given enough energy.
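To make the checkerboard rule concrete, here is a toy one-dimensional sketch in Python; the alternating-sign representation, the perturbation scheme, and the energy bookkeeping are all illustrative assumptions of the editor, not derivations from the theory:

```python
import random

# Toy 1-D foam: a list of (sign, size) domains that must strictly alternate
# +1 / -1 (the checkerboard rule). Deleting a domain puts two like-signed
# neighbours in contact; they fuse instantly, and each destroyed interface
# is counted as one unit of released energy.

def fuse(foam):
    """Merge adjacent like-signed domains until alternation is restored."""
    out, released = [], 0
    for sign, size in foam:
        if out and out[-1][0] == sign:
            out[-1] = (sign, out[-1][1] + size)  # fusion event
            released += 1                        # one interface destroyed
        else:
            out.append((sign, size))
    return out, released

foam = [(+1 if i % 2 == 0 else -1, 1.0) for i in range(1000)]
total_energy = 0.0
for _ in range(400):                 # random perturbations over "time"
    foam.pop(random.randrange(len(foam)))
    foam, e = fuse(foam)
    total_energy += e

print(len(foam), total_energy)       # fewer, larger domains; energy released
```

Run forward, the domain count falls and the mean size grows, which is the progressive enlargement and diversification described above; run backward, the domains shrink toward the featureless grey starting state.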

The givens assumed for this theory of everything. Since there are givens, it is probably not the real theory of everything but rather a simplified physics. (But maybe a stepping stone?)


The dimensionality question

The foam is infinite-dimensional. Within the foam, interfaces of all dimensionalities are abundant. Following Hawking, I suggest that we live on a three dimensional brane within the foam because that is the only dimensionality conducive to life. The foamy structure of the large-scale galaxy distribution that we observe thus receives a natural explanation: these are the lower-dimensional foam structures visible from within our brane. The interiors of the domains are largely inaccessible to matter and energy. <03-12-2022: We have an infinite regress again, this time toward increasing dimensionality: we never get to the bulk. Is it time to censor again and postulate progressively lessened contrast with greater dimensionality, and asymptotic to zero contrast? No; the foam model implies a bulk and therefore a maximum dimensionality, but not necessarily three. But what is so special about this maximum dimensionality? Let us treat yin-yang separation as an ordinary chemical process and apply the second law of thermodynamics to see if there is some theoretical special dimensionality. Assuming zero free energy change, we set enthalpy increase (“work”) equal to entropy increase (“disorder”) times absolute temperature. Postulating that separation work decreases with dimensionality and the entropy of the resulting space foam increases with dimensionality, we can solve for the special dimensionality we seek. The separation process has no intrinsic entropy penalty because there are no molecules at this level of description. The real, maximum dimensionality would be greater than theoretical to provide some driving force, which real transformations require. However, is the solution stable? Moreover, the argument implies that temperature is non-zero.><06-26-2022: Temperature is here the relative motion of all the minute, primordial domains. This could be leftover separation motion. How could all this motion happen without innumerable checkerboard-rule violations and thus many fusion events? Fusion events can be construed as interactions, and extra dimensions, which we have here, suppress interactions. More on this below.><02-07-2023: That said, the idea of primordial infinite dimensionality remains beguiling in its simplicity and possibilities.>
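The zero-free-energy argument can be put in numbers under stated assumptions. The functional forms below (separation work falling as W0/d, foam entropy rising logarithmically) are placeholders chosen only to show the solve, not claims of the theory:

```python
from math import log
from scipy.optimize import brentq

# Solve dH(d) = T*dS(d), i.e. zero free-energy change, for the "special"
# dimensionality d. Both functional forms are illustrative assumptions:
W0, S0, T = 10.0, 1.0, 1.0
dH = lambda d: W0 / d              # assumed: separation work falls with d
dS = lambda d: S0 * log(1.0 + d)   # assumed: foam entropy rises with d

d_special = brentq(lambda d: dH(d) - T * dS(d), 1.0, 100.0)
print(d_special)   # the balance point under these assumptions (≈5.4 here)
```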

12-23-2021: Since infinite dimensionality is a bit hard to get your mind around, let us inquire what is diminished upon increasing the dimensionality, and just set it to zero to represent infinite dimensionality. Some suggestions: order, interaction, and correlation. To illustrate, imagine two 2-dimensional Ardeans* walking toward each other. When they meet, one must lie down so the other can walk over them before either can continue on their way. That's a tad too much correlation and interaction for my taste.

03-03-2022: I suppose that as dimensions are added, order, correlation, and interaction decrease toward zero asymptotically. This would mean that 4D is not so different from 3D as 3D is from 2D. The latter comparison is the usual test case that people employ to try to understand extra dimensions, but it may be misleading. However, in 4D, room-temperature superconductivity may be the rule rather than the exception, due to extradimensional suppression of the interactions responsible for electrical resistance. The persistent, circulating 4D supercurrents, understood as travelling electron waves, may look like electrostatic fields from within our 3-brane, which, if true, would help to eliminate action-at-a-distance from physics. Two legs of the electron-wave circulation would travel in a Direction We Cannot Point. These ideas also lead to the conclusion that electrostatic fields can be diffracted. Bizarre, perhaps, but has anyone ever tried it? <03-21-2022: Yes, they have, and it is the classical electron diffraction experiment. The electrons are particles and are therefore not diffracted; they are accelerated in an electrostatic field that is diffracted, thereby building up a fringe pattern on the photographic plate. The particles, then, are just acting here as field tracers. Slight difficulty: neutrons can also be diffracted. A diffraction experiment requires that they move, however, so read on.>

Still to be explained: Newton's first law (i.e., inertia and motion).


How to accommodate the fact of motion

08-06-2021: Motion can be modelled as the array of ripples that spreads across the surface of a pond after a stone is thrown in. A segment of the wave packet coincides with the many minute domains that define the atoms of the moving object, and moves them along. The foam model implies surface tension, whereby increases in interface area increase the total energy of the domain. If the brane is locally thrown into undulations, this will increase the surface area of the interface and thus the energy. This accounts for the kinetic energy of moving masses. Momentum is the rate of change of kinetic energy with velocity and inertia is the rate of change of momentum with velocity.
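That closing chain of derivatives checks out against the classical formulas; a quick symbolic confirmation with E = mv²/2 as the classical kinetic energy:

```python
import sympy as sp

# With E = (1/2) m v^2: momentum is dE/dv and inertia (mass) is dp/dv,
# exactly as stated in the paragraph above.
m, v = sp.symbols('m v', positive=True)
E = sp.Rational(1, 2) * m * v**2

p = sp.diff(E, v)         # -> m*v : momentum
inertia = sp.diff(p, v)   # -> m   : inertia
print(p, inertia)
```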

03-08-2022: Brane surface tension would be a consequence of the basic yin-yang de-mixing phenomenon, because increases in interfacial area without volume increase (interface crumpling) can be construed as incipient re-mixing, which would go against the cosmological trend. Thus, the interface always tends to minimum area, as if it had surface tension. <03-04-2023: this tension provides the restoring force that one needs for an oscillation, which is important because waves figure prominently in this theory. However, a wave also needs the medium to have inertia, or resistance to change, and where is that in the present theory? It can be introduced in the form of a finite speed of light. For example, the permeability of free space, related to inductance, a kind of electrical inertia, can be expressed in terms of the speed of light and the permittivity of free space, related to capacitance, which inversely expresses a kind of electrical restoring force.>
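The permeability-permittivity trade mentioned in the note is the standard relation c = 1/√(μ₀ε₀); a one-line check with CODATA values:

```python
from scipy.constants import mu_0, epsilon_0, c

# "Electrical inertia" (mu_0) and "electrical stiffness" (1/epsilon_0)
# combine to fix the speed of light: c = 1/sqrt(mu_0 * epsilon_0).
print(1.0 / (mu_0 * epsilon_0) ** 0.5)   # ≈ 2.99792e8 m/s
print(c)                                  # the defined value, for comparison
```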


Dark matter and how to introduce gravity into this theory

08-14-2021: Gravity is being difficult here. I don't want to replace it with a bunch of spiral branewaves, and I don't know why it has an inverse-square power law. Let's drill down on this: when two interstellar dust grains collide, some kinetic energy is converted to heat energy, which radiates away. Without this process, called inelastic collision, the gravitational accretion of mass, and thus gravity itself, would not be observed. (By the way, one illusion that we may need to shed to make progress is that the convex spaces of the universe, such as the spaces occupied by stars, planets, and galaxies, are fundamental and the negative, concave, between-spaces are just meaningless pseudo-structures, which are called "spandrels" by evolutionary theoreticians. But could it be the other way around? In this, I am using the word "space" in an architectural sense.)

08-27-2021: Eureka! Let us suppose that each visible astrophysical object is surrounded by an invisible atmosphere-like structure consisting of mid-sized domains (larger than atomic scale but smaller than intergalactic voids). This could be dark matter. Let us further assume that minimizing the total interfacial area of this structure leads to sorting according to domain size, resulting in a gradient of domain sizes that places the smallest in the center. Therefore, lifting an object off the visible surface necessarily disturbs this minimum-energy structure, requiring an input of energy. This requirement would be the gravitational potential energy of an elevated object. The exact power law remains unexplained, but I think these assumptions bring us much closer to an explanation.

12-29-2021: Ordinary matter would be distinguished from dark matter by the ordinary domains being full of standing waves that store the energy of many historical merging events. The antinodes in the standing wave pattern would be the regular array of atoms thought to make up a crystal (most solid matter is crystalline). The facets of the crystals would correspond to the domain walls. 

<02-10-2023: The waves could be confined inside the ordinary domains by travelling across our 3-brane at right angles, strictly along directions we cannot point. However, something has to leak into the 3-brane to account for electrostatic effects.><02-12-2023: Crossing the 3-brane perpendicularly is possible by geometry if each particle is a hypersphere exactly bisected by the 3-brane, and mass-associated waves travel around the hypersphere surface.><02-14-2023: Presumably, the neutron produces no leakage waves, which could be assured by the presence of a nodal plane coinciding with the spherical intersection of the particle hypersphere with the 3-brane. Electrons and protons could emanate leakage waves, a possibility that suggests the origins of their respective electric fields. However, the fact that these particles have stable masses means that waves must be absorbed from the 3-brane as fast as they exit, meaning that an equilibrium with space must exist. For an equilibrium to be possible, space must be bounded somehow, which is already implied by the foam model. Since we know of only two charged, stable particles, two equilibrium points must exist. This scenario also explains why all electrons are identical and why all protons are identical. If their respective leakage waves are of different frequencies, the two particle types could equilibrate largely independently by a basic theorem of the Fourier transform.><02-19-2023: Particles of like charge would resonate with each other's leakage wave, resulting in a tendency to be entrained by it. This would account for electrostatic repulsion. Particles of opposite charge would radiate at different frequencies and therefore not mutually resonate, leading to no entrainment. However, since each particle is in the other's scattering shadow, it will experience an imbalanced force due to the shadow, tending to push it toward the other particle. This effect could explain electrostatic attraction.><03-14-2023: Gravity may also be due to mutual scatter shadowing, but involving a continuum spectrum of background waves, not the resonant line spectra of charged particles. Note that background waves are not coupled to any domains, and so do not consist of light quanta, which, according to the present theory, are waves coupled to massless domain pairs.><04-21-2023: The bisected hypersphere particle model predicts that subatomic particles will appear to be spheres of a definite fixed radius and having an effective volume 5.71× greater than expected from the same radius of a sphere in flat space. (5.71 = 1 + 1.5π) Background waves that enter the spherical surface will therefore be slow to leave, a property likely to be important for physics.>
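One way to reach the quoted 5.71 factor (an editorial reconstruction; the post does not show the arithmetic): the 3-volume of a unit 3-sphere, the surface of a 4-ball, is 2π²; a flat unit 3-ball holds (4/3)π; their ratio is exactly 1.5π; and adding 1 for the flat ball itself gives 1 + 1.5π:

```python
from math import pi

# Ratio of the 3-volume of a unit 3-sphere (2*pi^2) to that of a flat
# unit 3-ball ((4/3)*pi), plus 1 for the flat ball itself.
ratio = (2 * pi**2) / ((4.0 / 3.0) * pi)   # = 1.5*pi ≈ 4.712
print(ratio / pi)    # 1.5
print(1 + ratio)     # ≈ 5.712, the quoted effective-volume factor
```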

03-08-2022: There may be a very close, even mutualistic, relationship between domain geometry and interface waves, all organized by the principle of seeking minimum energy. Atomic nuclei within the crystal could be much tinier domains, also wave-filled but at much shorter wavelengths. The nuclear domains would sit at exactly the peaks of the electron-wave antinodes because these are the only places where the electron waves have no net component in the plane of the interface. Most particles will have the following structure as modelled with reduced dimensionality: a pair of prisms joined at the end-faces, the joint plane coinciding with the 3-brane of our world. Mass is standing waves in the joint plane. One prism is a plus domain and the other a minus domain. Both project out of our 3-brane into cosmologically-sized domains of opposite type. 09-04-2022:  Neutrinos may violate this pattern if they are unpaired domains completely surrounded by the opposite type of space and having no obligatory presence in our 3-brane. This would explain the weakness of their interaction with other types of matter and the existence of more than one type of neutrino. This picture predicts two types but we know three exist, a difficulty for this theory.


Motion, friction, and the cosmological redshift

09-11-2021: How could anything besides waves move from place to place without violating the checkerboard rule? I postulate that domains in front of the moving object are being triggered to merge, with release of wave energy (which radiates away), and domains in the rear are being re-split to restore the status quo. The energy needed for re-splitting will come from the brane-wave packet attached to the object. This accounts for the slowing of moving objects due to friction. However, the moon orbits the Earth apparently without friction and yet is inside the Earth's gravitational field and thus dark-matter structure, and thus must obey the checkerboard rule. My solution is to point out that the moon's motion is not really friction-free because interplanetary space is not really a vacuum, but contains 5 particles of plasma per cubic centimeter. I postulate that each such particle sits at a 0-brane within a space foam made of mid-sized domains.

09-21-2021: This feature ensures the prima facie equivalence of the present theory with conventional accounts of how friction happens, thereby helping the theory pass the test of explanation. However, friction in the absence of a detectable medium made of conventional matter appears to remain as a theoretical possibility.

09-22-2021: Dark-matter friction could progressively slow down electromagnetic oscillations, resulting in the cosmological red shift. The waste heat from this process may account for the microwave background radiation.

12-13-2022: Alternatively, space may really be expanding. Similar to the previously published brane-collision theory, the Big Bang may have been due to contact between two cosmologically sized domains of four spatial dimensions and opposite type, and our 3-brane is the resulting interface. The matter in our universe would originate from the small domains caught between the merging cosmological hyper-domains. This could account for the inflationary era thought to have occurred immediately after the Big Bang. The subsequent linear expansion of space may be due to the light emitted by the stars; if light is an undulation in the 3-brane along a large extra dimension, then light emission creates more 3-brane all the time, because an undulating brane has more surface area than a flat one.

* "Overview of Planiverse" page.


Saturday, October 31, 2020

#67. The Trembling-Network Theory of Everything [physics]


Red, theory; black, fact. 



I continue to mine the idea that the world of appearances is simulation-like, in that how we perceive it is strongly affected by the fact that our point of view is inside it, and illusions are rampant.


The slate-of-givens approach is intended to exploit consilience to arrive at a simplified physics that attributes as many phenomena as possible to historical factors and the observer's point of view. Simplified physics is viewed as a stepping-stone to the one, true TOE. The existence of widespread consilience implies that such a simplified physics exists.


The basic theory

The underlying reality is proposed to be a small-world network, whose nodes are our elementary particles and whose links ("edges" in graph theory) are seen collectively as the fields around those particles.

This network is a crude approximation to scale-free, but is structurally only a recursion of three generations (with a fourth in the process of forming), each composed of two sub-generations, and not an infinite regress. The first generation to form after the big bang was a bunch of triangular networks that we call baryons. In the next generation, they linked up to form the networks underlying light atomic nuclei. These, and individual protons, were big enough to stably bond to single nodes (electrons) to form the network version of atoms. Above the atomic/molecular/electromagnetic level, further super-clustering took on the characteristics of gravitation, whose hallmark seems to be rotation. At the grandest cosmological scales, we may be getting into a fourth "force" that produces the foamy structure of galaxy distribution. The observations attributed to the presence of dark matter may be a sign that, at the intra-galactic scale, the nature of the "fields" is beginning to shift again.

I conjecture that throughout this clustering process, a continuous thermal-like agitation was running through all the links, and especially violent spikes in the agitation pattern could rupture links not sufficiently braced by other, parallel links. This would have been the basis of a trial-and-error process of creation of small-world characteristics. The nature of the different "forces" we seem to see at different scales would be entirely conditioned by the type of clusters the links join at that scale, because cluster type would condition the opportunities for network stabilization by cooperative bracing.
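A toy version of that trial-and-error process (all modeling choices are the editor's, including the survival law): links survive agitation spikes in proportion to how many triangles brace them, and the surviving network tends toward triangle-rich clusters:

```python
import random
import networkx as nx

# Start from a random graph; "agitation spikes" try to rupture links, and
# a link's survival odds grow with its bracing, counted as the shared
# neighbours of its endpoints (triangles through the link).
G = nx.erdos_renyi_graph(200, 0.05, seed=1)
baseline = nx.average_clustering(G)

for step in range(300):
    for u, v in list(G.edges):
        bracing = len(list(nx.common_neighbors(G, u, v)))
        if random.random() > bracing / (bracing + 1.0):
            G.remove_edge(u, v)          # unbraced link ruptures
    for _ in range(50):                  # spontaneous link formation
        a, b = random.sample(range(200), 2)
        G.add_edge(a, b)

# Compare clustering before and after; braced (triangle-rich) structure
# typically ends well above the random baseline.
print(baseline, nx.average_clustering(G))
```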


Reconciliation with known science

Formation and rupture of links would correspond to the quantum-mechanical phenomenon of wave-function collapse, and the endless converging, mixing, and re-diverging of the heat signals carried by the network would correspond to the smooth, reversible time-evolution of the wave-function between collapses. The experience of periodic motions would arise from resonances in closed paths embedded in the network. When you see the moon move across the sun in an eclipse, <11-27-2020: no net links are being made or broken; the whole spectacle somehow arises by an energetically balanced creation and rupture of links.>

The photoelectric effect that Einstein made famous can be given a network interpretation: the work function is the energy needed to simultaneously break all the links holding the electron to the cluster that is the electrode. The observation of an electron that then seems to fly away from the electrode happens by calculation in the remaining network, after that network has been energized by heat-signal energy in excess of what was needed to break the links, reflecting back into the network from the broken ends.

How distance would arise

All the ineffably many nodes in the universe would be equidistant from each other, which is possible if they exist in a topological space; such spaces have no distance measure. I think it likely that what you experience as distance is the number of nodes that you contain divided by the number of links connecting the cluster that is you with the cluster that you are observing. It remains to figure out how some of the concomitants of distance arise, such as delay in signal transmission and the cosmological redshift.
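Written as a function (identifiers are the editor's), the proposed measure is just a ratio:

```python
# Proposed distance: nodes contained in the observer divided by the number
# of links joining observer to observed. More links -> "closer".
def network_distance(observer_nodes: int, inter_cluster_links: int) -> float:
    return observer_nodes / inter_cluster_links

print(network_distance(10**6, 10**4))   # 100.0     (richly connected: near)
print(network_distance(10**6, 10))      # 100000.0  (sparsely connected: far)
```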

Reconciliation with the finite speed of light

11-01-2020: The time-delay effect of distance can be described by a hose-and-bucket model if we assume that all measurements require link breaking in the observer network. The energy received by the measuring system from the measured system is like water from a hose progressively filling a bucket. The delayed overflow of the bucket would correspond to the received energy reaching threshold for breaking a link in the observer network. The fewer the links connecting observer to observed relative to the observer size (i.e., the greater the distance), the slower the bucket fills and the longer signal transmission is observed to take.
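A minimal numerical rendering of the hose-and-bucket model (threshold and rates are arbitrary): the fill rate scales with the link count, so the time to overflow, i.e. the observed transmission delay, scales with distance as defined above:

```python
# Bucket fills at a rate proportional to the number of observer-observed
# links; a link in the observer breaks (a measurement registers) when the
# accumulated energy crosses a fixed threshold.
THRESHOLD = 1000.0

def transmission_delay(links: int, power_per_link: float = 1.0) -> float:
    fill_rate = links * power_per_link   # hose flow into the bucket
    return THRESHOLD / fill_rate         # time to overflow = observed delay

for links in (100, 10, 1):
    print(links, transmission_delay(links))   # fewer links -> longer delay
```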

11-02-2020: The above mechanism cannot transmit a pulsatile event such as a supernova explosion. It takes not one, but two integrations to convert an impulse into a ramp function suitable for implementing a precise delay. Signal theory tells us that if you can transmit an impulse, you can transmit anything. The second integration has already been located in the observer cluster, so the obvious place in which to locate the first integration is in the observed cluster. Then when the link in the observer cluster breaks, which is an endothermic event, energy is sucked out of both integrators at once, resetting them to zero. That would describe an observer located in the near field of the observed cluster. In the far field, the endothermic rupture would cool only the observer cluster; most of the radiative cooling of the observed cluster would come from the rupture of inter-cluster links, not intra-cluster links. Thus, hot clusters such as stars are becoming increasingly disconnected from the rest of the universe. This can account for the apparent recessional velocity of the galaxies, since I have conjectured that distance is inversely proportional to numbers of inter-cluster links.

Predictions of the fate of the universe

We often hear it said that the reason the night sky is black is that the expansion of the universe is continuously creating more space in which to warehouse all the photons emitted by all the stars. However, the network orientation offers a simpler explanation: inter-cluster links at the grandest scale are being endothermically destroyed to produce the necessary cooling, and the fewer these become, the longer the cosmological distances appear to be. I suppose that when these links are all gone, we all cook. The microwave background radiation may be a harbinger of this. Clearly, my theory favours the Big Rip scenario of the fate of the universe, but a hot Big Rip.

Accounting for the ubiquity of oscillations

05-01-2021: At this point, an improved theory of oscillations can be offered: Oscillating systems feature 4 clusters and thus 4 integrators connected in a loop to form a phase-shift oscillator. These integrators could be modeled as a pair of masses connected by a spring ( = 2 integrators) in each of the observer and observed systems ( = 2 x 2 = 4 integrators).
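As a sanity check that four integrators in a loop do oscillate, here is the mass-spring version in plain Python (parameters arbitrary); the four state variables x1, v1, x2, v2 each update by integrating another:

```python
# Two masses joined by one spring: four integrators in a loop.
m, k, dt = 1.0, 1.0, 0.001
x1, v1, x2, v2 = 1.0, 0.0, -1.0, 0.0

trace = []
for _ in range(20000):
    f = -k * (x1 - x2)     # spring force on mass 1 (mass 2 feels -f)
    v1 += (f / m) * dt     # integrator 1
    v2 += (-f / m) * dt    # integrator 2
    x1 += v1 * dt          # integrator 3
    x2 += v2 * dt          # integrator 4
    trace.append(x1 - x2)

print(max(trace), min(trace))   # separation swings between about +2 and -2
```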

Motion and gravity

11-30-2020: Motion would be an energetically balanced breaking of links on one side of a cluster and making of links on the other. This could happen on a hypothetical background of spontaneous, random link making and breaking. Acceleration in a gravitational "field" would happen if more links are coming in from one side than the opposite side. More links will correspond to a stronger mutual bracing effect, preferentially inhibiting link breaking on that side. This will shift the making/breaking equilibrium toward making on that side, resulting in an acceleration. <12-11-2020: The universal gravitational constant G could be interpreted as expressing the probability of a link spontaneously forming between any two nodes per unit of time.>

Dimension and direction

01-13-2021: It is not clear how the direction and dimension concepts would emerge from a network representation of reality. If distance emerges from 2-way interactions of clusters, perhaps direction emerges from 3-way interactions and dimension arises from a power law of physical importance versus the number of interacting clusters in a cluster of clusters. This idea was inspired by the fact that four points are needed to define a volume, three are needed to define a plane, and two are needed to define a line.

02-13-2021: Alternatively, angle may be a matter of energetics. Assume that new links form spontaneously at an unalterable rate and only link rupture rate varies. The heat injected by link creation must be disposed of by a balanced rate of link rupture, but this will depend in detail on mutual bracing effects. If your rate of rupture of links to a given cluster is minimal, you will be approaching that cluster. The cluster with which your rupture rate is highest is the one you are receding from. Clusters with which you score average rupture rates will be 90 degrees off your line of travel. The distribution of clusters against angle is predicted from geometry and the appearance of the night sky to be proportional to sin(θ), but a random distribution of rupture rates would predict a bell curve (Gaussian) centered on the average rupture rate. Close, but no cigar. The tails of the Gaussian would produce a sparse zone both fore and aft. Moreover, since there must always be a maximum and minimum, you will always be heading exactly toward some cluster and exactly away from some other: not what we observe.
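The mismatch can be eyeballed numerically (the linear rate-to-angle mapping is the editor's stand-in for the argument's assumption):

```python
import numpy as np

# Gaussian rupture rates mapped linearly onto angles [0, pi], versus the
# sin(theta)/2 density that solid-angle geometry demands.
rng = np.random.default_rng(0)
rates = rng.normal(size=100_000)
theta = (rates - rates.min()) / (rates.max() - rates.min()) * np.pi

hist, edges = np.histogram(theta, bins=18, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.round(hist, 3))                  # bell curve with thin tails
print(np.round(np.sin(centers) / 2, 3))   # geometric expectation
```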

03-06,07-2021: That the universe is spatially at least three-dimensional can be reduced to a rule that links do not cross. Why the minimum dimensionality permitted by this rule is the one we observe remains to be explained. 

Momentum

Momentum can be explained by attributing it to the network surrounding a cluster, not to the cluster itself. Heat must flow without loss (how?) from in front of a travelling cluster around to the rear (I hope eventually to be able to purge this description of all its directional assumptions), suggesting closed flow-lines through the larger network reminiscent of magnetic field lines. (This is similar in outline to Mach's explanation of momentum, as being due to the interaction of the test mass with the distant galaxies.) It seems necessary to postulate that once this flow pattern is established, it persists by default. An especially large cluster in the vicinity will represent a high-conductivity path for the heat flow, possibly creating a tendency for links to form perpendicular to the line of travel and offset toward the large cluster, which might explain gravitational capture of objects into stable orbits. Finally, the overall universal law would be: heat of link formation = heat of link rupture + increases in network heat content due to increases in network melting point due to increases in mutual bracing efficiency. A simple concept of melting point is the density of triangles in the network. Still to be explained: repulsive forces.
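The triangle-density notion of melting point is directly computable; two 15-edge examples (graph choices are the editor's):

```python
import networkx as nx

# "Melting point" as triangles per edge: a triangle-rich network versus a
# triangle-free one with the same number of links.
braced = nx.complete_graph(6)   # 15 edges, 20 triangles
loose  = nx.cycle_graph(15)     # 15 edges, 0 triangles

for name, G in (("braced", braced), ("loose", loose)):
    n_triangles = sum(nx.triangles(G).values()) // 3
    print(name, n_triangles / G.number_of_edges())
```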

Repulsive forces

04-04-2021: Repulsive forces are only seen with electromagnetism and then only after a local network has been energized somehow. When particles said to be oppositely charged recombine, neutral atoms are re-formed, which creates new triangles and thus increases melting point. The recombination of particles said to be of like charge creates relatively few triangles and is therefore disfavored, creating the impression of mutual repulsion.
 

More on the origin of momentum

Inter-cluster links are not individually bidirectional in their heat conductivity, but a (usually) 50:50 mixture of unidirectional links going each way. Momentum and spontaneous First Law motion become prevalent in classically-sized networks due to small imbalances in numbers of cluster A to cluster B links versus cluster B to cluster A links. This produces a random pattern of spontaneous heat flows across the universe. Converging flows burn out links (and are thus self-limiting) and diverging flows preserve links, causing them to increase in number locally. This process nucleates the gravitational clumping of matter. A directional imbalance in the interior of a cluster causes First Law motion by spontaneously transporting heat from front to back. Front and back are defined by differences in numbers of inter-cluster links (to an arbitrary external cluster) among subsets of cluster nodes.

Case study of a rocket motor

For a rocket motor to work, we have to assume that one of these asymmetrical links can only be burned out by heat applied to its inlet end. During liftoff, the intense heat down in the thruster chambers burns out (unidirectional) links extending up into the remainder of the craft. This leaves an imbalanced excess of links within the rocket body going the other way, leading to a persistent flow of heat downward from the nose cone. This cooling stabilizes links from overhead gravitationally sized clusters ending in the nose cone, causing them to accumulate, thereby shortening the "distance" from the nose cone to those clusters. Meanwhile, the heat deposited at the bottom of the rocket progressively burns out links from the rocket to the Earth, thereby increasing the "distance" between the rocket and the Earth. The exhaust gases have an imbalanced excess of upward-directed asymmetric links due to the temperature gradient along the exhaust plume that serves to break their connection to the rocket and create the kind of highly asymmetrical cluster required for space travel. <04-11-2021: The details of this scenario all hang together if we assume that link stabilization is symmetrical with link burnout: that is, it is only responsive to what happens at the inlet (in this case, cooling).> Since kinetic energy is associated with motion, the directional link imbalance must be considered a form of energy in its own right, one not sensible as heat as usually understood.

Future directions

05-28-2021: To make further progress, I might have to assume that the links in the universal network are the real things and that the nodes are just their meeting places, which only appear to be real things because this is where the flow of energy changes direction. I then assume that all links are directional and that pairing of oppositely-directed links was actually the first step in the evolution of the universe. Finally, I decompose these directional links into an inlet part joined to an outlet part. With this decomposition, a link pair looks like this:
⚪⚫
⚫⚪
Notice the interesting ambiguity in how to draw the arrows. A purely directional link recalls the one-way nature of time and may represent undifferentiated space and time. A final decision was to treat a repulsive force as a link whose disappearance is exothermic, not endothermic, because this indirectly allows the formation of more of the default kind of link.

Thursday, June 13, 2019

#54. Reality is Virtual but the Host Computer is Analog. [physics, chemistry, neuroscience]


Red, theory; black, fact.
The nucleus around which a TOE will hopefully crystallize.


Here is my idea of what lies behind the veil of space time: something like a Turing machine, but with a multiplicity of tapes and a multiplicity of processors. Each tape corresponds to one elementary particle in space time and the processors implement only Hebb’s rule, a rule of learning first discovered in neuroscience that governs correlations among signal sources. The tapes are polymer-like, and their ongoing elongation by polymerization at the ends causes the passage of time. This elongation is associated with a white-noise signal unique to each particle/tape/strand because the monomers are sampled from a population with a Gaussian size distribution.
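A sketch of one such tape in Python (distributional choices are the editor's; the clip to positive sizes stands in for the upward trend discussed below):

```python
import numpy as np

# One particle/tape/strand: length grows by monomers drawn from a Gaussian
# size distribution, so the increment record is the strand's private
# white-noise signal and the running length plays the role of time.
rng = np.random.default_rng(7)
monomers = np.clip(rng.normal(loc=1.0, scale=0.3, size=10_000), 0.01, None)

length = np.cumsum(monomers)          # the ever-growing strand
signal = monomers - monomers.mean()   # the detrended white-noise signal
print(length[-1], signal.std())
```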
06-26-2019: A theoretical schema showing the basic strand-processor interaction. The theory borrows from the Turing machine concept, Hebb's rule of learning, and the chemistry of polymerization. Black, one information-bearing polymer strand; blue, processor; red, monomers.
07-03-2019: A more complex, two-strand scheme. The monomer cloud has been omitted for clarity.
07-09-2019: A still more complex, three-strand scheme. Assumption: at a given point, multiple strands can adhere to the same processor, and vice-versa.

06-26-2019: The first illustration is my current best guess as to what a "Hebb processor" is like, but as we say, "Many questions remain." The short, black lines are the catalyzed ligation points between "monomers," and these are the points of attraction to the processor. If the rear pull-off point encounters a big gap between ligation points, the processor will advance an unusually great distance in one step, creating an unusually long catalytic pocket at the front, which will have a selectivity for longer monomers, thereby implementing a copying mechanism. (Causally, this is backwards, but the alternative seems to involve a messy discussion of chemical equilibria.)

This “machine” is presumed to be naturally occurring and to belong to a particular stage in the physical evolution of the universe. (i.e., I make no appeal to arguments of intelligent design, even by space aliens.) By the anthropic principle, the machine is not necessarily typical of all structures extant at that stage. <06-04-2020: In other words, we are living in a natural subworld that is simulation-like in that the view from within is entirely unlike the view from without, but the two views are lawfully related by something that could be called the "subworld law," an example of which is given below, i.e., "d = K/rxy." (This concept is nothing new because it could also serve in outline as a description of human conscious experience, which is most unlike a mass of neurons signaling to each other in the dark.) Thus, the Theory of Everything could turn out to be a series of nested subworld laws.>

The length of a strand relative to some interior point is a Gaussian white-noise time-series signal with an upward trend just steep enough to eliminate negative slopes. I will deal here with the detrended version of this signal because, on the laboratory distance scale, both the observed system and the observer will share the same trend, preventing its direct observation. Moreover, the polymerization process is postulated to preserve a permanent record of the detrended signal. Therefore, while the future does not exist in this model of time, the past is perfectly preserved. A set of distinguishable time series is called “panel data” and is a Euclidean space by the mathematical definition and can therefore map onto and explain physical space, at least on the laboratory scale.

Imagine some panel data consisting of two time series, X and Y, representing two elementary particles. Take a slice of this data ten samples long and plot them as two ten-dimensional vectors, X and Y. The dot product of these vectors is then easily computed as x₁y₁ + x₂y₂ + … + x₁₀y₁₀. A Euclidean space is defined as an affine space (i.e., containing no special points like the origin) where the vector dot product is defined. Recall that this dot product is equal to the length of X times the length of Y times cosθ, where θ is the angle between the vectors. Moreover, cosθ is here equal to rxy, aka Pearson’s r, a measure of correlation between signal sources. Pearson's r is commonly used in the sciences, including neuroscience, and ranges from -1 for negative correlations to +1 for positive correlations; zero indicates no correlation.
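The identity being leaned on here, that cosθ between mean-centered signal vectors is exactly Pearson's r, is easy to verify; note the centering step, which the detrending described above effectively supplies:

```python
import numpy as np

# Dot-product / correlation identity on a 10-sample slice of panel data.
rng = np.random.default_rng(1)
X = rng.normal(size=10)
Y = 0.6 * X + 0.4 * rng.normal(size=10)

Xc, Yc = X - X.mean(), Y - Y.mean()
cos_theta = Xc @ Yc / (np.linalg.norm(Xc) * np.linalg.norm(Yc))
pearson_r = np.corrcoef(X, Y)[0, 1]
print(cos_theta, pearson_r)   # identical up to rounding
```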

I conjecture that rxy represents distance in space time, and vector length represents mass-energy in space time. An rxy of 0 would represent infinite distance and an rxy of 1 would represent 0 distance, a state found only in black-hole singularities or the big bang singularity. Processes would be experienced as close together because they are correlated (previously suggested in Post #4), not correlated because they are close together. The latter is the usual assumption, usually rationalized as being due to the easy exchange of synchronizing photons or virtual photons at close range. However, we seem to be moving here toward the elimination of the light/photon construct from physics. Good riddance; it was always pretty dicey.

(Deprecated, Part 6)
07-25-2015 to 07-27-2019: A simpler possibility is d = K/rxy and rxy = K/d. This change advantageously limits how small space-time distances can become, thereby eliminating infinities from gravity calculations. K is this minimum length. With this revision, the dot product of two vectors in the simulation becomes equal to gravitational binding energy.

No Flying Spaghetti Monster would be complete without meatballs, and these would be the Hebb processors alluded to in the opening paragraph. Each strand would have a processor on each end that may catalyze the polymerization process. Each processor would also be connected to a random point within the mass of strands. This connection moves along the associated strand like a phonograph needle,  reading the recorded signal while keeping pace with the growing end. The processor also has a front read-point in the present. The two read points may or may not be on the same strand. If a correlation is detected between the two read points, the correlation is permanently enhanced by a modification of the polymerization process, per Hebb’s rule. All the orderliness of the universe is supposed to emerge from this one type of calculation.

07-09-2019: At this point, we have not eliminated "space" from physics; we have merely replaced Euclidean space by another kind of space that is less structured, in which the "spaghetti machine" has its being. The new space is a "metric space" but has neither norm nor dot product axiomatically, although they can be present in simulation. The metric is bijective with the counting (natural) numbers and is equal to the number of "primordial time elements" (PTEs) in a polymer of same.

Degrees of correlation less than maximal are encoded in the processor as the number of strands adhering to the processor above the minimum for inter-strand copying, namely two. One of these "middle strands," as I shall call them, is illustrated in the sketch of the three-strand scheme, and they putatively degrade the fidelity of the copying process and reduce Pearson's r in proportion to their numbers, while also introducing a time delay into the copying process, due to changes in the length of the processor. A reduction in Pearson's r, which increases the encoded space time distance, simultaneous with an increase in the copying time delay is responsible for the finite speed of light.

07-15-2019: If N is the number of middle strands on a processor, then a reasonable guess as to its effect on Pearson’s r would be r = 1/(N + 1). (Deprecated, Part 6) 07-22-2019: Slight problem: the units analysis doesn't work out, so this paragraph is a work in progress.
07-25-2019: The problem is solved if we set the electron binding energy equal to the length of the projection of the electron vector on the proton vector. This is not the dot product and it has the correct units, namely mass-energy. The revised attempt to reproduce Rydberg's formula is:
ΔE = ||e||*(1/n₁ – 1/n₂), n₂ > n₁. (1)

The default interaction implemented by Hebb’s rule would be gravitational attraction. Black hole formation illustrates that gravitation has a built-in positive feedback, and this would derive from the positive feedback built into simple forms of Hebb’s rule.
07-09-2019: To provide my hypothetical Hebb processors with such a positive feedback, I postulate the following: Middle strands come and go in an energy-dependent excision/reintegration process and have a small loop of processor adhering to them when free, which explains how the processor length changes occur. A high-fidelity copying process releases more energy than does a low-fidelity copying process, and excision requires energy input. These ingredients, together with the fidelity-degrading property of a middle strand, should be sufficient for a positive feedback.
If the two read points are on the same strand, the result will be an oscillation. Electromagnetism could be the set of interactions peculiar to oscillating strands. The variance needed to express the oscillation would be deducted from a fixed total, resulting in less variance available to represent distances by its correlations. A smaller distance signal will be more rapidly modified by the Hebb processors, resulting in faster responses to forces and a smaller mass according to Newton’s a = F/m. Thus, we expect neutral particles to be more massive than charged particles, and this tendency is indeed found among the mesons and when comparing the proton and neutron. The relatively great mass of the proton and neutron and the nuclear strong force itself may emerge from a cyclic pattern of three strands (the quarks) connected by three processors. The example of the benzene molecule teaches us to expect novel results from cyclization. (07-31-2019: This will be a huge idea when asking what underlies the metric space alluded to above. 01-17-2020: The metric space may itself be a simulation running on a processor situated in a still simpler space, namely a topological space, in which only a few distinctions matter, such as inside-outside and closed-open. 01-17-2020: The great mass of the baryons may come from the chaos inevitable in the celestial-mechanics version of the three-body problem, but the three bodies would be the three quarks. Recall that in the present theory, noise amplitude corresponds to mass-energy, and fast chaos looks like noise. The three-fold nature of these particles may also create the three-dimensionality of space, but I am having trouble picturing this.)

05-22-2020: An extension of this idea would be that chaos is the source of all mass and the properties of a timeline depend on how many highly-correlated others there are ("intimates") and the degree of this correlation. One intimate produces an oscillation but no mass; two or more produce chaos and some amount of mass. Intimacy can be understood as relative, leading to a hierarchy of relationships. The fact that three bodies are the minimum for chaos remains an attractive explanation for the three-dimensionality of space. Details remain elusive, but I am now trying for a holistic vision and no longer focusing narrowly on baryons.

05-22-2020: There may be an alternative kind of middle strand or mode of adhesion that enhances copying fidelity upon adhesion rather than degrading it. This amendment to the theory may be required to model interactions that appear repulsive.

Hebb processors with their rear read points in the distant past would open up long-distance communication channels in space time, giving us the by-now familiar experience of looking millions of years into the past through powerful telescopes, to see galaxies as they were when the universe was young. The communication would be instantaneous, but from an old source; not slow from a new source.

06-18-2019: The big bang:
I conjecture that the universe began in a singularity for a trivial reason: it initially had no way to represent information, let alone correlations, because all the incoming monomers were PTEs, identical in size and having the smallest possible size. A slow direct condensation reaction in the monomer pool then gradually built up larger blocks of PTEs, causing the average size of the items adding to the strands by polymerization to increase progressively. The standard deviation of the size distribution would likewise have increased. Space would have expanded rapidly at first, as predicted by the inflationary hypothesis, because the first little bit of polymerization entropy to develop would have had a disproportionate effect on the system's ability to represent information. The mass-energy of all particles has also been increasing ever since the big bang.

07-25-2019: Therefore, by equation (1), we expect that spectral lines from ancient luminous matter will be redder than comparable lines from contemporary matter, as found by Hubble, which explains the cosmological red shift.

01-17-2020: Another perspective would be that the future is a false vacuum, the past is a true vacuum, and the present is the ever-expanding boundary between. The false-vacuum decay would be driven by entropy increase, not enthalpy (i.e., heat content) decrease (which is allowed by the second law of thermodynamics), because only the interior of the true-vacuum bubble would be occupied by information (i.e., the timelines).

"I could be bounded in a nutshell and count myself a king of infinite space, were it not that I have bad dreams." Hamlet, II.ii

Saturday, June 3, 2017

#30. The Russian-dolls--multiverse Part I [physics]

Matryoshka/pupa
Red, theory; black, fact.

The nucleus around which a TOE will hopefully crystallize.


6-03-2017
I usually assume in these pages that the space we live in has an absolute frame of reference, as Newton taught, and which Einstein taught against. Not only that, but that this frame of reference is a condensate of some sort, rather like the water that a fish swims in.

I also assume that the divide-and-conquer strategy that has served science so well thus far can blithely continue with the (conceptual) disassembly of this space into its constituent particles. At that point the question arises whether these particles are situated in yet another space, older and larger than ours, or whether you go directly to spacelessness, where entities have to be treated like Platonic forms. In the former case, one wonders if that older, larger space in turn comes apart into particles situated in a still older and larger, etc., etc., ad infinitum.

I am told that infinities are the death of theories. Nevertheless, let us hold our noses and continue with the Russian Dolls idea, merely assuming that the nesting sequence is not infinite and will not be infinite until the entire multiverse is infinitely old, because the "dolls" form one by one, by ordinary gravitational collapse, from the outside in.

What, exactly, is it that collapses? Call them wave functions, following quantum mechanics. In the previous post, we see that wave functions are slightly particle-like in having a centre of symmetry. In the outermost space, previously called #, the wave crests always move at exactly the speed of light.

7-14-2017
This speed is not necessarily our speed of light, c, but more likely some vastly greater value.

6-03-2017
The space-forming particles of # are themselves aggregates with enough internal entropy to represent integers and enough secondary valences to form links to a set of nearest neighbors to produce a network that is a space. This space acts like a cellular automaton, with signals passing over the links to change the values of the stored integers in some orderly way. The wave functions are the stereotyped, stable figures that spontaneously develop in the automaton out of the initial noise mass left over from catastrophic gravitational collapse, or some abstract, spaceless equivalent. 

Gravity would enter as a geometric effect; impossible at 1D, poorly developed at 2D, commonplace but commonly stalled in extended systems at 3D, and irresistible at 4D and higher (The latter conclusion is based on an anthropic argument in "The Universe in a Nutshell", by Stephen Hawking).

Finally, assume that the dimensionality of a space increases steadily over time, suggesting that the number of links emanating from each node in the underlying network increases slowly but surely. Macroscopically, this dimensionality increase could look something like protein folding. This does not yet explain gravity, a task for another day&&, but static nonlinearities in the automaton's representation system may be involved.*

To facilitate discussion, let us label the Russian-dolls universes from the outside in, in the sequence 1, 2, 3,...etc, and call this number the "pupacity" of a given frame of reference. (From the Latin "pupa," meaning "doll.") Let us further shorten "pupacity" to "p" for symbol-compounding purposes. Thus, the consecutively labelled spaces can be referred to as p1 (our former "#"), p2, p3,... etc.

A final, absolutely crucial assumption is that pn can exhibit global motions ("n" is some arbitrary pupacity), such as rotation, in the frame of reference of p(n-1). Yes, we are talking here about a whole, damned universe rotating as a rigid unit. Probably, it can drift and vibrate as well.

Now, by the assumptions of the previous post, these global motions must be subtracted from the true, outer, speed-of-light speed of the wave crest to produce its apparent speed and direction when seen from within pn. Thus, the universe's love of spinning and orbiting systems of all sizes is explained: a spinning, global-motion vector is being subtracted from the non-spinning, outermost one. As the pupacity of our frame of reference increases, more and more of these global vectors are being subtracted, causing the residual apparent motion to get progressively smaller. We would assume under current physics that the wave functions are acquiring more and more mass, to make them go slower and slower, but mass is just a fiction in the scenario presented above. However, the reliance of current physics on the mass construct is a golden opportunity to determine the pupacity of planet Earth.
It is three.

Three, because physics knows of three broad categories of particle mass: the photon, leptons, and baryons. The photon would be native to p1, leptons, such as electrons and positrons, would be native to p2, and baryons, such as protons and neutrons, would be native to p3, our own, dear home in the heavens. 

01-09-2019: It is an interesting coincidence that our pupacity equals the dimensionality of our space. Are dimensionality and pupacity linked during cosmological evolution?&&

6-03-2017
Some interpretations follow. Positronium would be a standing-wave pattern made up of oppositely rotating wave functions, an electron and a positron, both native to p2. A neutron would be exactly the same thing, but native to p3. Note that both are unstable in isolation.

How is it that we observers in p3 can even detect electrons, say, if those are not native to p3? Because p2 is necessarily older than p3 and has had more time to develop extra dimensions. This will give p3 thin dimensions when seen in the frame of reference of p2, and it is along these thin dimensions that the electrons of p2 approach our own, native protons closely enough to participate in our p3 physics.

Neutron stars would be p4, but I haven't figured out black holes. Just big p4s?

*6-05-2017
or an amplitude-speed coupling.

Wednesday, May 31, 2017

#29. My Second Theory of Everything [physics]

Red, theory; black, fact.

This post comes from considering how wavelike, low-frequency light becomes particle-like, high-frequency light as frequency is smoothly increased. Waves are continuous, whereas particles are discontinuous; how, then, does the breakup occur?

You have to put the source in the picture. Recoil of the source atom sends the wave function off in a specific direction, but the wave function is known to expand (about its center of symmetry?) as it goes. Presumably, it is the vector sum of these two motions that must equal the speed of light; either one is presumably free to take on some lower speed, say, that of a pitched softball. I conjecture that as frequency increases, the particle-like drift of the center progressively dominates the mixture at the expense of the local, wave-like expansion of the wave function about its center. This is how I see waves morphing into particles as the frequency increases. 

These ideas suggest the existence of a unique, watershed frequency at which both motions are equal, and equal to one-half the speed of light when the vectors are aligned. I suspect that this frequency lies in the terahertz range, between radar frequencies and the far infrared, partly on the basis that this seems to be the last part of the electromagnetic spectrum to find technological use. The non-dominance of either the particle or the wave model in this range may translate into a perfect storm of undesirable properties. That comment about the softball, however, suggests the possible existence of easy, classroom experiments with these frequencies that illustrate wave-particle duality.
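In symbols (notation is the editor's, summarizing the last two paragraphs):

```latex
\lVert \vec{v}_{\mathrm{drift}} + \vec{v}_{\mathrm{expand}} \rVert = c,
\qquad \text{watershed, aligned case: } v_{\mathrm{drift}} = v_{\mathrm{expand}} = \tfrac{c}{2}.
```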

These considerations brought me to the following set of TOE assumptions, some from relativity theory, some in apparent contradiction of it, and some from quantum mechanics:
  • There is an absolute frame-of-reference, which I shall call "#."
  • All motions seen in this frame of reference will be observed to occur at the speed of light (c); no more, but no less, and only this frame of reference has this property.
  • All speeds lower than c are illusions caused by the motion of the observer's frame of reference.
  • That which moves always at c is not a wave function, but a phase marker of some sort within it, such as a zero crossing or a wave crest.
  • The local wave function evolution relative to its center of symmetry combined with the drift of that center relative to # always travels at c relative to #.
  • If local evolution is an expansion along all wave function radii, you have light; if it is a rotation about the center of symmetry (i.e., motion perpendicular to radii), you have matter.
  • Light wave functions will be like nested spherical shells, whereas matter wave functions will have a lobar, angle-dependent structure like a p-, d-, or f-orbital in theoretical chemistry. The lobes are essential to provide a contrast pattern that could, in principle, be observed to spin.
  • The presence of one axis of rotation produces the neutrino; two simultaneous axes of rotation produce the mesons; three produce the remaining stable particles, e, p, and n. If the three rotational rates are distinguishable, the resulting structure has a handedness.
  • The matter/antimatter dichotomy arises from this handedness, when combined with a law of conservation of spin that would result from space initially being symmetrical. 
  • The mesons should have an ability in 3-space to flip over into their corresponding antiparticles.

Wednesday, March 29, 2017

#26. The Phasiverse [physics]

Red, theory; black, fact.
The nucleus around which a TOE will hopefully crystallize.


3-29-2017
I will be arguing here that our reality, the world of appearances, is encoded in the relative phases of an ineffably large number of oscillators, each of which is a kind of primitive clock.

An early interpretation of the theory of quantum mechanics was that there is a harmonic oscillator somehow assigned to each point in space, and that these account for the matter fields of the universe. Examples of such oscillators (the definition is abstract and mathematical), unsuitable for easy, weekend universe creation, would be masses bouncing up and down on springs, and electronic devices called tank circuits, which are just one capacitor connected across the terminals of one inductor, plus taps on the inductor for getting the energy in. (I am thinking here of the Hartley oscillator, of which I built half a dozen as a teenager.)
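For concreteness, the resonant frequency of such a tank circuit follows the standard formula f = 1/(2π√(LC)); the component values below are arbitrary illustrative choices:

```python
from math import pi, sqrt

# A Hartley-style tank circuit: 100 uH across 100 pF.
L, C = 100e-6, 100e-12
f = 1.0 / (2 * pi * sqrt(L * C))
print(f)   # ≈ 1.59 MHz
```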

If a bunch of such oscillators can communicate with each other (exchange oscillatory energy), this is called coupling, and it can make the oscillators tend to pull each other in to the same, common phase. The Huygens's clocks experiment begins with two old-school pendulum clocks in a case with their pendulums swinging in some random phase relationship. The next day, mysteriously, the pendulums will always be found swinging in opposite directions. The coupling is evidently due to tiny, rhythmic forces travelling through the common case from clock to clock.
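A standard toy model of this pull-in is a pair of Kuramoto-style phase oscillators; the coupling law below is an illustrative convention, not a claim about the clocks' mechanics, and a negative coupling constant reproduces the anti-phase outcome:

```python
import numpy as np

# Two coupled phase oscillators:
#   d(theta_i)/dt = omega_i + K * sin(theta_j - theta_i)
K, dt = -0.5, 0.01                    # negative K: anti-phase attracting
theta = np.array([0.0, 2.0])          # arbitrary initial phases
omega = np.array([1.0, 1.02])         # nearly identical natural rates

for _ in range(100_000):
    diff = theta[::-1] - theta        # other oscillator's phase, less own
    theta = theta + (omega + K * np.sin(diff)) * dt

print((theta[0] - theta[1]) % (2 * np.pi))   # settles near pi: anti-phase
```

Flipping K positive pulls the pair into a common phase, which is the case assumed in what follows.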

If the coupling is positive, as assumed here, (it's negative in the above experiment), the phase pull-in effect becomes stronger the closer the two phases approach each other, causing a positive feedback effect. This is very reminiscent of Hebb's rule in neuroscience and the tendency of natural attractive forces such as gravity to depend inversely on distance. I have already offered Hebb's rule in these pages as an abstract rule of attraction and binding in a scheme for polymerizing spaceless but noisy "time lines" into a three dimensional network that approximates the space we live in. However, oscillators make better space-forming entities than these "time lines" on a number of counts.

First of all, the phase pull-in effect alluded to above provides a simple answer to questions such as where the organizing principle comes from. All you need to explain is where the oscillators themselves all came from, how they oscillate, and why they are coupled. Since the oscillators begin life in spacelessness, it is hard to see how they could avoid interacting to produce a coupling effect. Second, oscillators need no past or future; they can arise as a succession of causally related nows that alternates between two contrasting forms. (Since we haven't gotten as far as space yet, these would have to be abstract, spaceless entities that smack of yin and yang.) Figures in Conway's game of Life would seem to be examples of this alternation.

What is the time required for such an alternation? The question is meaningless; they just do it. With no past or future, the special status of the present becomes self-explanatory, alleviating some of the cognitive dissonance that goes with the concept of a unified space-time. This space-time, and the even more bizarre idea that it is warped by mass-energy as if embedded in an even higher-dimensional space, starts to look like a device to visualize one's way to solutions to problems that have their origin in unvisualizable spacelessness.

A great many oscillators all with the same phase is not an interesting universe. However, suppose this is impossible because of "train wrecks" happening during the synchronization process that produce frustration of the synchronization analogous to spin frustration in spin glasses. An example would be a cyclic relationship of oscillators in which a wave goes around the loop endlessly. Such cycles may correspond to particles of matter in our universe, and the spiral waves that they would throw off into surrounding space may correspond to the fields around such particles.

A black hole or galaxy would be surrounded by a tremendous number of such radiating fields. The resulting desynchronization of the oscillators making up the surrounding space would increase the average phase difference between phasically nearby oscillators, thereby inhibiting their coupling, thereby inhibiting the travel of signals generally through the region. Result: the speed of light is reduced in the vicinity, resulting in the bending of light rays, called gravitational lensing. Notice how easily we derive an effect that formerly required General Relativity.

The next level of description deals with where the oscillators come from.

4-23-2017
Let us jettison the particle model altogether at this point and assume the universe to be made of the waves themselves, with no need for generating objects. These waves might have a tendency to synchronize as a fundamental given. If it is not fundamental, maybe the explanation for it can safely be left to a future generation of physicists. (The image I get at this point is of a series of temporary camps struck during the ascent of some stupendous mountain, for step-wise consolidation of gains, with the grail of the TOE located at the summit.)

As a second thread of this argument, I note that some of the phenomena characteristic of quantum theory can be explained as due to the practicalities of representing functions like waves, practicalities that are always in your face when programming a computer, but never mentioned in the physics I have read so far. In programming, you have to define memory space for all variables, which is always, ultimately, an integer or a set of integers, with both a maximum and a minimum (nonzero) amount that can be represented.

Quantization could be due to the presence of small quantities comparable in size to the value of the least significant bit of an integer-like entity. (Deprecated, Part 4)
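A quantization effect of exactly this kind drops out of integer storage; a minimal demonstration (the register width is the editor's choice):

```python
import numpy as np

# Storing a wave in an 8-bit signed register: amplitudes snap to multiples
# of the least significant bit, with a hard maximum and a smallest nonzero
# representable value.
lsb = 1.0 / 127.0                        # value of the least significant bit
wave = np.sin(np.linspace(0, 2 * np.pi, 16))

quantized = np.round(wave / lsb) * lsb   # snap to the integer grid
print(np.max(np.abs(wave - quantized)))  # error bounded by lsb/2
```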