We define gravitation as the statistical effect of matter being collapsed by bosons from the vacuum currents of massive bodies.
This perspective goes beyond the classical limits usually associated with gravitation, instead presenting a quantum foundation for the classical effects. It neatly avoids asymptotes at high energies, and exists on the same terms as charge-based effects.
In this representation, gravitation is not a separate force, but simply the proportion of all collapse events that can be attributed to bosons from the massive body. These events are always directed towards that body.
The 'direction' of gravitational force is a result of two effects:
Where two bosons collapse into a fermion, the fermion's position will most likely lie on the direct line between the two boson sources, because that is the first solution that satisfies the quantization condition. Where one of the bosons contributes significantly to the mass of a conserved particle, the particle will be displaced in that direction.
Next we'll consider the direction of the boson sources, relative to the conserved particle.
For the vacuum, bosons arrive from all directions, so over repeated collapses of the particle due to this omnidirectional flux, the average displacement tends to zero.
This means that if a particle is 'free', the background gravitational effects diminish at the classical scale. However, an effect like the Casimir effect may be observed if nearby objects screen the vacuum flux.
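The claim that an omnidirectional vacuum averages to zero net displacement can be checked with a short Monte Carlo sketch. This is purely illustrative: the unit step size, event counts, and use of Python are our own assumptions, not part of the mechanism.

```python
import math
import random

def random_unit_vector():
    """Sample an isotropic direction in 3D (uniform on the sphere)."""
    z = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(phi), r * math.sin(phi), z)

def mean_displacement(n_events):
    """Average displacement over n isotropic collapse events."""
    sx = sy = sz = 0.0
    for _ in range(n_events):
        x, y, z = random_unit_vector()
        sx += x
        sy += y
        sz += z
    return (sx / n_events, sy / n_events, sz / n_events)

random.seed(0)
for n in (10, 1_000, 100_000):
    m = mean_displacement(n)
    print(n, math.sqrt(m[0] ** 2 + m[1] ** 2 + m[2] ** 2))
```

The magnitude of the mean shrinks roughly as 1/sqrt(n), consistent with background gravitational effects vanishing at the classical scale for a free particle.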
When a massive body is nearby, that body will have collapsed and re-radiated many bosons from the vacuum, and although they share the same mechanism as vacuum bosons, they 'compete' with the vacuum's bosons to collapse the test particle.
Where a significant number of these bosons come from a massive body, each such collapse will nudge the particle slightly towards the body. Repeated displacements become significant, and the particle drifts towards the source of the bosons.
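The drift described above can be illustrated with a one-dimensional sketch in which some fixed fraction of collapse events is attributed to the body's bosons. The probability and step size are illustrative assumptions, not derived quantities.

```python
import random

def drift(p_body, n_events, step=1.0):
    """Net 1D displacement after n collapse events: with probability
    p_body the nudge points towards the body (negative x); otherwise
    the direction is drawn isotropically (+/- with equal odds)."""
    x = 0.0
    for _ in range(n_events):
        if random.random() < p_body:
            x -= step  # boson from the massive body
        else:
            x += step if random.random() < 0.5 else -step  # vacuum boson
    return x

random.seed(1)
print(drift(0.0, 10_000))  # free particle: no systematic drift
print(drift(0.1, 10_000))  # 10% body flux: net drift towards the body
```

Even a small bias per event accumulates into a large net displacement, while the unbiased case wanders around zero.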
A moving particle in this vacuum may interact with more vacuum bosons, provided we treat local conditions as nondegenerate (statistically, integrating over a wide enough space loses the local information). This applies to all vacuum flux, supporting the acceleration of a particle towards a source, as well as any structural changes to composites.
This helps explain the equivalence of form of the Unruh effect and Hawking radiation. It relies upon particles interacting with a vacuum that has discrete, instantiated elements that can be spatially distinguished.
The structure of a conserved composite body is generally 'elastic', and distorts when displacements are applied. This changes the individual collapse events that make up the sequential re-constitution of the composite, which conserves momentum.
Larger composites will need more displacements to achieve velocity, which helps us describe classical inertia and momentum in quantum terms.
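As a sketch of this counting argument: if each collapse event displaces a single constituent by a fixed nudge, the composite's centre of mass moves by that nudge divided by the number of constituents, so the number of events needed scales with composite size. The one-nudge-per-event model is our simplifying assumption.

```python
import math

def events_needed(com_shift, n_constituents, nudge=1.0):
    """Collapse events needed to shift a composite's centre of mass by
    com_shift, when each event displaces one constituent by nudge
    (so the centre of mass moves by nudge / n_constituents per event)."""
    return math.ceil(com_shift * n_constituents / nudge)

print(events_needed(1.0, 1))     # a single free particle
print(events_needed(1.0, 1000))  # a 1000-constituent composite
```

The larger composite needs proportionally more events for the same shift, which is the quantum-counting analogue of classical inertia in this picture.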
While a boson propagates, its mass-energy acts as a phase operator on the waves of other bosons. Consequently, where one boson proceeds to overlap another, its mass-energy widens the phase window for its own collapse, and the collapse of overlapping bosons (a process analogous to the Higgs mechanism).
When these quantum fluctuations are integrated over classical limits, this creates gravitational force as expected, and even permits bosons of zero mass to collapse if massive bosons overlap them.
Vacuum energy will interact with a large body (fig.1: C), and radiate from it as bosons, again as vacuum energy. The more mass-energy body C has, the more vacuum energy it will collapse and re-emit.
As body C’s bosons radiate, some will collapse. With increasing radius, their area for interaction increases, giving a higher probability of collapse from vacuum energy, as per eq.4[15]. Some of the bosons available to test particle A will be environmental vacuum energy, and some will have been emitted by body C. Where bosons from body C are preferred, this results in a gravitational deflection (or ‘force’). The resulting approximation of gravitational deflection [7] is comparable to classical formulations.
[equation to be inserted]
The mean deflection ∇ is independent of the mass of the test particle (fig.1: A). It is the probability that the test particle interacts with the body’s flux p_b, rather than with the environmental vacuum flux p_v, scaled by the mean expected vector ∇_b between particle events in which the particle interacts with the body’s bosons.
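The relation stated in words above can be written as a small helper. We assume the probability of interacting with the body's flux "rather than" the vacuum flux composes as p_b / (p_b + p_v); the text does not state the combination rule, so this is a hypothetical reading.

```python
def mean_deflection(p_body, p_vacuum, v_body):
    """Mean deflection per collapse event: the chance the event is due
    to the body's flux rather than the vacuum's (assumed here to be
    p_body / (p_body + p_vacuum)), scaled by the mean expected vector
    v_body towards the body."""
    fraction = p_body / (p_body + p_vacuum)
    return tuple(fraction * component for component in v_body)

# With no vacuum competition, the full vector is recovered;
# with a 3:1 vacuum-to-body ratio, a quarter of it remains.
print(mean_deflection(1.0, 0.0, (2.0, 0.0)))
print(mean_deflection(1.0, 3.0, (4.0,)))
```

Note that the test-particle mass does not appear anywhere in the expression, matching the mass-independence claimed in the text.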
Particles we regard as stable may become decoherent in extreme gravitational flux, such as the environment of a black hole.
As for the direction of the interaction, the first opportunity for gravitational interaction is the point directly between the bodies (see arrows meeting, fig.1). Vacuum interactions will be uniformly distributed in all directions, unless some very large structures generate flux. The direction resulting from the collapse of the particle's own bosons will depend on its structure, including any structural changes attributed to its momentum.
This problem concerns the compatibility of General Relativity with quantum formulations. Quantization, that is, the discrepancy in how values are packaged and what those values represent, is seen as the problem that must be solved in order to bridge these representations and provide a continuously valid representation at both large and small scales. There is currently no direct mapping or transition between the two.
Although we do not have a solution for this problem, we can instead show how gravitation manifests in our mechanism: how it originates in continuous bosons, how it is quantized as fermion instances, and how its statistics may be derived from the fundamental mechanics.
When we build statistics, like a gravitational field representation, we inevitably lose some information about the bosons, or miss some details of the actual emergent behaviour of the target system. For example, our attempt to derive the Newtonian-like approximation (above) provided correlation at large distances and even improved upon the Newtonian formulation at short distances. However, it failed to account for actual phase values, which makes the approximation good only for decoherent vacuum currents, and it makes some assumptions about the homogeneity of the vacuum.
Only in a truly deterministic model, with all information present, can we obtain an accurate outcome. Unfortunately, this presents difficulties when we package our outcomes as generalisations, using preconditions that are statistical and emergent properties, or shorthand for more complicated states. Conventional statistics will not adequately describe the conditions; being statistics, they lose the information needed to show the 'quirky' behaviour at the scales where quantum field theory (QFT) proves more useful. Newtonian mechanics is shown to be unrepresentative of reality, and QFT can fall short too.
QFT has its statistical losses: the vacuum is grossly oversimplified because it integrates detail into statistics, losing the instances of vacuum bosons, and replacing them with summarising parameters that were created to explain observations, in terms of deviations from a base model, e.g. coupling constants, expectation values, permittivity, permeability, various fields and corrections, derived potentials, and also free parameters that have no fundamental basis.
The physics community attempts to 'quantize gravitation' in order to reconcile the leading but incompatible representations. Both General Relativity and QFT are statistical in nature, operating from different representations and principles. They will forever be incompatible without a more fundamental intermediary (general) formulation.
If a conventional bridge between the two is possible, it will be formulated using a mechanism like our own, but there will be no direct mapping from one representation to the other; vital information will be lost at each stage. Further, we would not be able to supply the missing information to a statistical representation without treating that information as corrections to the statistics. That is untidy and, by focusing on the corrections as quantities in their own right, might mislead us about the foundations that are critical to emergent behaviour. Both LQG and string theory are too closely aligned with their respective foundations to be truly 'bridging' in this respect. Our future work will assume our own foundations, from which the other formulations may be derived as statistics.
Finally, we'd like to say that this 'correction' approach applies not only to quantization, but also to most other aspects of high-energy physics and cosmology that are troubling the physics community.
Gravitational waves work in the same way as regular gravitation, manifesting as a directional preference for regular matter to be deflected, during its usual reconstitution process, by the gravitational flux. A gravitational wave, believed to originate in extreme distant events, is a temporary variance in that flux density.
When this flux traverses a large body, it may locally compress and expand matter within a relatively short length, affecting the distance between mean fermion positions within the body, and its effects should be classically measurable. (a) If we assume flat space, as per our red-shift paper (2014), then an observer distorted in this manner would observe red- and blue-shifting of surrounding matter for the period the observer is affected by the gravitational wave; the source could be pinpointed from the (spatial) directional distribution of spectral shift. (b) Lengths within rigid structures would fluctuate.
Measuring a gravitational wave is a statistical calculation over the usual deflections, rather than a straight reading from a field; i.e. you cannot measure it with ever greater precision; you would just find fewer (or no) impulses within a smaller sample length, with regular gravitational interactions as background noise. Thus gravitational waves exhibit their own uncertainty relation, between the imparted momentum and the size of the sample.
We have reservations about whether a unified field theory can include gravitation. If our hypothesis holds, then an independent gravitational field is a fictitious field, not representative of reality, and reconciling gravitation with the charge-based fields becomes problematic.
The answer might instead lie in finding a fundamental mechanism with uniformly-defined entities, the simplest algebraic abstraction, and very simple rules. From that, we derive the conventional fields statistically, from our understanding of how they are expressed by fundamental structures, and with knowledge of the information lost when building such approximations.
In our picture, the collapse of a fermion can be attributed to gravitation (fig. 1, above) when the vacuum component originated from a large body that re-emitted the vacuum flux. That same collapsed fermion may also be partaking in an electromagnetic interaction, which is a sequence of events that conducts a vacuum boson while moving a more massive boson that identifies the matter fermion.
In fig.2, all bosons overlapping the test particle have the same structure. Those partitioned as C and E are attributed to the gravitational flux from a nearby massive body. A are confined bosons, and B and D are bosons from anonymous vacuum energy. Fermions 1 and 2 are assumed to be an example sequence of fermion events within the test particle: respectively a virtual vacuum interaction, and (e.g.) a quark. 'Charge' is the property of conducting flux through the particle.