Red, theory; black, fact.
The nucleus around which a TOE will hopefully crystallize.
Here is my idea of what lies behind the veil of space time: something like a Turing machine, but with a multiplicity of tapes and a multiplicity of processors. Each tape corresponds to one elementary particle in space time, and the processors implement only Hebb's rule, a learning rule first discovered in neuroscience that governs correlations among signal sources. The tapes are polymer-like, and their ongoing elongation by polymerization at the ends causes the passage of time. This elongation is associated with a white-noise signal unique to each particle/tape/strand because the monomers are sampled from a population with a Gaussian size distribution.
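To make this concrete, here is a minimal sketch in Python of the picture just described; all parameter values are invented for illustration. Each strand grows by appending monomers drawn from a Gaussian size distribution, so its cumulative length is a trending signal whose detrended part is white noise unique to that strand.

    import random

    MEAN_MONOMER_SIZE = 1.0   # trend per time step; invented value, chosen large
    SIZE_SD = 0.1             # enough that sampled sizes stay positive

    def elongate(strand, steps):
        """Append 'steps' monomers, each a Gaussian size sample."""
        for _ in range(steps):
            strand.append(random.gauss(MEAN_MONOMER_SIZE, SIZE_SD))
        return strand

    strand = elongate([], 1000)       # one strand = one elementary particle

    # Cumulative length over time: white noise riding on an upward trend.
    lengths, total = [], 0.0
    for monomer in strand:
        total += monomer
        lengths.append(total)

    # Detrended signal: per-step deviations from the mean monomer size.
    detrended = [m - MEAN_MONOMER_SIZE for m in strand]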
06-26-2019: A theoretical schema showing the basic strand-processor interaction. The theory borrows from the Turing machine concept, Hebb's rule of learning, and the chemistry of polymerization. Black, one information-bearing polymer strand; blue, processor; red, monomers.
07-03-2019: A more complex, two-strand scheme. The monomer cloud has been omitted for clarity.
07-09-2019: A still more complex, three-strand scheme. Assumption: at a given point, multiple strands can adhere to the same processor, and vice versa.
06-26-2019: The first illustration is my current best guess as to what a "Hebb processor" is like, but as we say, "Many questions remain." The short, black lines are the catalyzed ligation points between "monomers," and these are the points of attraction to the processor. If the rear pull-off point encounters a big gap between ligation points, the processor will advance an unusually great distance in one step, creating an unusually long catalytic pocket at the front, which will have a selectivity for longer monomers, thereby implementing a copying mechanism. (Causally, this is backwards, but the alternative seems to involve a messy discussion of chemical equilibria.)
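A toy version of that copying mechanism, under my reading that the gap just traversed at the rear sets the pocket length at the front; the function names and the best-fit selection rule are mine, for illustration only.

    import random

    def copy_strand(template, monomer_pool):
        """Copy a template: each rear gap sets the front pocket size,
        which then selects the best-fitting monomer from the pool."""
        product = []
        for gap in template:                    # gap between ligation points
            pocket = gap                        # pocket length mirrors the gap
            best = min(monomer_pool, key=lambda m: abs(m - pocket))
            product.append(best)
        return product

    pool = [random.gauss(1.0, 0.1) for _ in range(500)]    # Gaussian monomers
    template = [random.gauss(1.0, 0.1) for _ in range(10)]
    replica = copy_strand(template, pool)   # approximates the template signal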
This "machine" is presumed to be naturally occurring and to belong to a particular stage in the physical evolution of the universe. (That is, I make no appeal to arguments of intelligent design, even by space aliens.) By the anthropic principle, the machine is not necessarily typical of all structures extant at that stage. <06-04-2020: In other words, we are living in a natural subworld that is simulation-like in that the view from within is entirely unlike the view from without, but the two views are lawfully related by something that could be called the "subworld law," an example of which is given below, namely d = K/rxy. (This concept is nothing new, because it could also serve in outline as a description of human conscious experience, which is most unlike a mass of neurons signaling to each other in the dark.) Thus, the Theory of Everything could turn out to be a series of nested subworld laws.>
The length of a strand relative to
some interior point is a Gaussian white-noise time-series signal with an upward
trend just steep enough to eliminate negative slopes. I will deal here with the
detrended version of this signal because, on the laboratory distance scale,
both the observed system and the observer will share the same trend, preventing
its direct observation. Moreover, the polymerization process is postulated to
preserve a permanent record of the detrended signal. Therefore, while the
future does not exist in this model of time, the past is perfectly preserved. A set of distinguishable time series is called "panel data"; such a set satisfies the mathematical definition of a Euclidean space and can therefore map onto, and explain, physical space, at least on the laboratory scale.
Imagine some panel data consisting of two time series representing two elementary particles. Take a slice of this data ten samples long and plot the two series as ten-dimensional vectors, X and Y. The dot product of these vectors is then easily computed as x₁y₁ + x₂y₂ + … + x₁₀y₁₀. A Euclidean space is defined as an affine space (i.e., one containing no special points like an origin) on which the vector dot product is defined. Recall that this dot product is equal to the length of X times the length of Y times cos θ, where θ is the angle between the vectors. Moreover, cos θ is here equal to rxy, aka Pearson's r, a measure of correlation between signal sources. Pearson's r is commonly used in the sciences, including neuroscience, and ranges from −1 for negative correlations to +1 for positive correlations; zero indicates no correlation.
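A quick numerical check of the claim that cos θ equals Pearson's r; the identity holds exactly when the two series are mean-centered, as the detrended signals are assumed to be (sample data invented).

    import math
    import random

    X = [random.gauss(0, 1) for _ in range(10)]
    Y = [random.gauss(0, 1) for _ in range(10)]
    X = [x - sum(X) / len(X) for x in X]   # center each series on zero
    Y = [y - sum(Y) / len(Y) for y in Y]

    dot = sum(x * y for x, y in zip(X, Y))        # x1*y1 + ... + x10*y10
    len_X = math.sqrt(sum(x * x for x in X))      # vector length of X
    len_Y = math.sqrt(sum(y * y for y in Y))
    cos_theta = dot / (len_X * len_Y)

    # Pearson's r from its definition: covariance over the two SDs.
    n = len(X)
    r_xy = (dot / n) / ((len_X / math.sqrt(n)) * (len_Y / math.sqrt(n)))
    print(abs(r_xy - cos_theta) < 1e-12)          # True: cos(theta) = r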
I conjecture that rxy represents distance in space time, and vector length represents mass-energy in space time. An rxy of 0 would represent infinite distance, and an rxy of 1 would represent zero distance, a state found only in black-hole singularities or the big-bang singularity. Processes would be experienced as close together because they are correlated (previously suggested in Post #4), not correlated because they are close together. The latter is the usual assumption, usually rationalized as being due to the easy exchange of synchronizing photons or virtual photons at close range. However, we seem to be moving here toward the elimination of the light/photon construct from physics. Good riddance; it was always pretty dicey.
(Deprecated, Part 6)
07-25-2019 to 07-27-2019: A simpler possibility is d = K/rxy, i.e., rxy = K/d. This change advantageously limits how small space-time distances can become, thereby eliminating infinities from gravity calculations; K is this minimum length. With this revision, the dot product of two vectors in the simulation becomes equal to gravitational binding energy.
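A worked sketch of the revised mapping, with K and the sample correlations invented. Note that under this revision rxy = 1 encodes d = K, the minimum length, which is exactly what keeps distances from collapsing to a singular point.

    K = 1.0   # the minimum length; invented value

    def distance(r_xy):
        """Space-time distance encoded by a correlation, per d = K / r_xy."""
        return K / r_xy

    print(distance(1.0))    # r = 1    -> d = K (the floor; no zero distance)
    print(distance(0.5))    # r = 0.5  -> d = 2K
    print(distance(0.01))   # r -> 0   -> d -> infinity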
No Flying Spaghetti Monster would be complete without meatballs,
and these would be the Hebb processors alluded to in the opening paragraph. Each strand would have a processor on each end that may
catalyze the polymerization process. Each processor would also be connected to
a random point within the mass of strands. This connection moves along the
associated strand like a phonograph needle, reading the
recorded signal while keeping pace with the growing end. The processor also has
a front read point in the present. The two read points may or may not be on the
same strand. If a correlation is detected between the two read points, the
correlation is permanently enhanced by a modification of the polymerization
process, per Hebb’s rule. All the orderliness of the universe is supposed to
emerge from this one type of calculation.
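A minimal sketch of that one calculation, assuming, for concreteness only, that the processor's state is a single coupling number and that "enhancing a correlation" means a standard Hebbian increment proportional to the product of the two read-point values.

    import random

    LEARNING_RATE = 0.01    # invented value

    def hebb_step(rear, front, coupling):
        """One Hebb update: correlated activity strengthens the coupling."""
        return coupling + LEARNING_RATE * rear * front

    # Rear read point replays the recorded past; front reads the present.
    recorded_past = [random.gauss(0, 1) for _ in range(100)]
    present = [0.8 * x + 0.2 * random.gauss(0, 1) for x in recorded_past]

    coupling = 0.0
    for rear, front in zip(recorded_past, present):
        coupling = hebb_step(rear, front, coupling)
    print(coupling)   # drifts upward because the two signals correlate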
07-09-2019: At this point, we have not eliminated "space" from physics; we have merely replaced Euclidean space with another kind of space that is less structured, in which the "spaghetti machine" has its being. The new space is a "metric space" but has neither norm nor dot product axiomatically, although both can be present in simulation. The metric is bijective with the counting (natural) numbers and is equal to the number of "primordial time elements" (PTEs) in a polymer of them.
Degrees of correlation less than maximal are encoded in the processor as the number of strands adhering to the processor above the minimum for interstrand copying, namely two. One of these "middle strands," as I shall call them, is illustrated in the sketch of the three-strand scheme. Middle strands putatively degrade the fidelity of the copying process and reduce Pearson's r in proportion to their number, while also introducing a time delay into the copying process due to changes in the length of the processor. A reduction in Pearson's r (which increases the encoded space-time distance), occurring together with an increase in the copying time delay, is responsible for the finite speed of light.
07-15-2019: If N is the number of middle strands on a processor, then a reasonable guess as to its effect on Pearson's r would be r = 1/(N + 1). (Deprecated, Part 6) 07-22-2019: Slight problem: the units analysis doesn't work out, so this paragraph is a work in progress.
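For what it is worth, combining the guess r = 1/(N + 1) with d = K/rxy from above gives distances that grow linearly with the middle-strand count (a sketch only; the guess itself is flagged above as a work in progress).

    K = 1.0   # minimum length, as above

    for N in range(4):            # N = number of middle strands
        r = 1.0 / (N + 1)         # guessed effect on Pearson's r
        d = K / r                 # encoded distance: K, 2K, 3K, 4K, ...
        print(N, r, d)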
07-25-2019: The problem is solved if we set the electron binding energy equal to the length of the projection of the electron vector on the proton vector. This is not the dot product, and it has the correct units, namely mass-energy. The revised attempt to reproduce Rydberg's formula is:
ΔE = ||e||·(1/n₁ − 1/n₂), n₂ > n₁. (1)
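A sketch of the proposed quantities, with made-up ten-sample vectors standing in for the electron and proton: the binding energy is read as the length of the projection of e on p, i.e., (e·p)/||p||, and equation (1) is then evaluated for a pair of levels.

    import math
    import random

    e = [random.gauss(0, 1) for _ in range(10)]   # electron vector (invented)
    p = [random.gauss(0, 1) for _ in range(10)]   # proton vector (invented)

    dot_ep = sum(a * b for a, b in zip(e, p))
    norm_p = math.sqrt(sum(b * b for b in p))
    binding_energy = dot_ep / norm_p   # projection of e on p: a length,
                                       # i.e., mass-energy in this theory

    norm_e = math.sqrt(sum(a * a for a in e))
    n1, n2 = 1, 2
    delta_E = norm_e * (1.0 / n1 - 1.0 / n2)      # equation (1)
    print(binding_energy, delta_E)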
The default interaction implemented by Hebb’s rule would be gravitational attraction. Black hole formation illustrates that gravitation has a built-in positive feedback, and this would derive from the positive feedback built into simple forms of Hebb’s rule.
07-09-2019: To provide my hypothetical Hebb processors with such a positive feedback, I postulate the following: middle strands come and go in an energy-dependent excision/reintegration process and have a small loop of processor adhering to them when free, which explains how the processor length changes occur. A high-fidelity copying process releases more energy than does a low-fidelity copying process, and excision requires energy input. These ingredients, together with the fidelity-degrading property of a middle strand, should be sufficient for a positive feedback.
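A toy rate model of these ingredients (all rates invented) to exhibit the feedback: fewer middle strands mean higher fidelity, higher fidelity releases more energy, and more energy drives more excision, removing middle strands still faster.

    N = 3.0   # middle strands on the processor; invented starting value

    for step in range(30):
        fidelity = 1.0 / (N + 1.0)        # middle strands degrade fidelity
        energy = fidelity                 # high fidelity releases more energy
        excision_rate = 0.5 * energy      # per-strand excision needs energy
        reintegration_rate = 0.1          # per-strand; invented constant
        N = max(N + (reintegration_rate - excision_rate) * N, 0.0)
        # As N falls, fidelity and hence the excision rate rise, so N falls
        # ever faster: positive feedback toward maximal correlation.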
If the two read points are on the same strand, the result will be an oscillation. Electromagnetism could be the set of interactions peculiar to oscillating strands. The variance needed to express the oscillation would be deducted from a fixed total, leaving less variance available to represent distances by its correlations. A smaller distance signal will be more rapidly modified by the Hebb processors, resulting in faster responses to forces and a smaller mass according to Newton's a = F/m (a toy version of this argument is sketched after the notes below). Thus, we expect neutral particles to be more massive than charged particles, and this tendency is indeed found among the mesons and when comparing the proton and neutron. The relatively great mass of the proton and neutron, and the nuclear strong force itself, may emerge from a cyclic pattern of three strands (the quarks) connected by three processors. The example of the benzene molecule teaches us to expect novel results from cyclization.
07-31-2019: This will be a huge idea when asking what underlies the metric space alluded to above.
01-17-2020: The metric space may itself be a simulation running on a processor situated in a still simpler space, namely a topological space, in which only a few distinctions matter, such as inside-outside and closed-open.
01-17-2020: The great mass of the baryons may come from the chaos inevitable in the celestial-mechanics version of the three-body problem, but the three bodies would be the three quarks. Recall that in the present theory, noise amplitude corresponds to mass-energy, and fast chaos looks like noise. The three-fold nature of these particles may also create the three-dimensionality of space, but I am having trouble picturing this.
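Here is the toy version of the variance-budget argument promised above; the budget, the split, and the nudge size are invented. Variance spent on an oscillation is unavailable to the distance-encoding noise, and the smaller that remaining signal, the larger the fractional change a fixed Hebbian nudge produces, which reads as a faster response to force and hence a smaller mass.

    TOTAL_VARIANCE = 1.0   # fixed variance budget per strand; invented

    def response_rate(oscillation_variance, nudge=0.01):
        """Fractional change produced by one fixed-size Hebb nudge."""
        distance_variance = TOTAL_VARIANCE - oscillation_variance
        return nudge / distance_variance

    neutral = response_rate(0.0)    # no oscillation: slow response
    charged = response_rate(0.5)    # oscillating (charged): faster response
    # Mass goes as 1/response_rate (a = F/m), so the neutral particle
    # comes out more massive than the charged one.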
05-22-2020: An extension of this idea would be that chaos is the source of all mass, and the properties of a timeline depend on how many highly correlated others there are ("intimates") and on the degree of this correlation. One intimate produces an oscillation but no mass; two or more produce chaos and some amount of mass. Intimacy can be understood as relative, leading to a hierarchy of relationships. The fact that three bodies are the minimum for chaos remains an attractive explanation for the three-dimensionality of space. Details remain elusive, but I am now trying for a holistic vision and no longer focusing narrowly on baryons.
05-22-2020: There may be an alternative kind of middle strand or mode of adhesion that enhances copying fidelity upon adhesion rather than degrading it. This amendment to the theory may be required to model interactions that appear repulsive.
Hebb processors with their rear read points in the distant past would open up long-distance communication channels in space time, giving us the by-now familiar experience of looking millions of years into the past through powerful telescopes to see galaxies as they were when the universe was young. The communication would be instantaneous but from an old source, not slow from a new source.
06-18-2019: The big bang:
I conjecture that the universe began in a singularity for a trivial reason: it initially had no way to represent information, let alone correlations, because all the incoming monomers were PTEs, identical and of the smallest possible size. A slow, direct condensation reaction in the monomer pool then gradually built up larger blocks of PTEs, causing the average size of the items added to the strands by polymerization to increase progressively. The standard deviation of the size distribution would likewise have increased. Space would have expanded rapidly at first, as predicted by the inflationary hypothesis, because the first little bit of polymerization entropy to develop would have had a disproportionate effect on the system's ability to represent information. The mass-energy of all particles has also been increasing ever since the big bang.
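A toy version of the condensation story (pool size and fusion fraction invented): single PTEs fuse pairwise, so both the mean block size and the standard deviation of the pool grow from an initial SD of exactly zero, the informationless state identified here with the singularity.

    import random
    import statistics

    pool = [1] * 10000   # at t = 0, every monomer is a single PTE

    for epoch in range(5):
        random.shuffle(pool)
        fusing, rest = pool[:2000], pool[2000:]   # a fraction fuse pairwise
        pool = rest + [a + b for a, b in zip(fusing[0::2], fusing[1::2])]
        print(epoch, statistics.mean(pool), statistics.pstdev(pool))
        # Mean size and SD both rise; a wider size distribution can carry
        # more signal variance, i.e., more representable information.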
07-25-2019: Therefore, by equation (1), we expect spectral lines from ancient luminous matter to be redder than comparable lines from contemporary matter, as found by Hubble; this explains the cosmological redshift.
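A worked instance of this prediction, with an invented ratio of 0.9 for the ancient value of ||e|| relative to today's: by equation (1), every transition energy of that epoch is scaled by the same factor, and since wavelength varies inversely with energy, every line is stretched accordingly.

    ΔE_old/ΔE_now = ||e||_old/||e||_now = 0.9
    λ_old/λ_now = ΔE_now/ΔE_old ≈ 1.11, i.e., a redshift of z ≈ 0.11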
01-17-2020: Another perspective would be that the future is a false vacuum, the past is a true vacuum, and the present is the ever-expanding boundary between. The false-vacuum decay would be driven by entropy increase, not enthalpy (i.e., heat content) decrease (which is allowed by the second law of thermodynamics), because only the interior of the true-vacuum bubble would be occupied by information (i.e., the timelines).
"I could be bounded in a nutshell and count myself a king of infinite space, were it not that I have bad dreams." Hamlet, II.ii