Most of our work to date has produced emergent qualitative results, but science demands quantitative proof. Our current research focus is therefore the development of deterministic simulation software that generates reconcilable statistics.
We foresee many useful types of result, and therefore several modes for running the simulation. These modes rely on different interfaces for preconditions, exit conditions, and the shape of the corresponding results.
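As a minimal sketch of what such a mode interface might look like (the names `SimulationMode`, `preconditions`, `exit_condition`, and `collect_result` are our illustrative assumptions, not the actual implementation):

```python
from dataclasses import dataclass
from typing import Any, Callable

# Hypothetical shape of a simulation "mode": each mode supplies its own
# preconditions (initial state), exit condition, and result extractor.
@dataclass
class SimulationMode:
    preconditions: Callable[[], dict]             # builds the initial state
    exit_condition: Callable[[dict, int], bool]   # (state, events) -> stop?
    collect_result: Callable[[dict], Any]         # shapes the final result

def run(mode: SimulationMode, step: Callable[[dict], dict]) -> Any:
    """Drive one simulation under the given mode until its exit condition."""
    state = mode.preconditions()
    events = 0
    while not mode.exit_condition(state, events):
        state = step(state)
        events += 1
    return mode.collect_result(state)
```

A trivial usage example: a mode whose precondition is a zeroed counter, whose exit condition is "five events have occurred", and whose result is the counter value.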
Verification runs will use a small set of particles and stop within a small number of events on the timeline. The presence of events within expected bounds will be verified, within low error margins. For example, a test might check whether waves of known phase and mass-energy interact (or not) and collapse into fermion events.
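One possible shape for such a test, sketched with a hypothetical stand-in for the real wave interaction (`simulate_collision` and its collapse rule are assumptions for illustration only):

```python
import math

def simulate_collision(phase_a, phase_b, max_events=100):
    # Stand-in for the real interaction: here, waves whose phases align
    # within a small tolerance collapse into a fermion event after a few
    # timeline steps; otherwise no event occurs before the step limit.
    if math.isclose(phase_a, phase_b, abs_tol=0.01):
        return {"event": "fermion", "step": 3}
    return {"event": None, "step": max_events}

def test_aligned_phases_collapse():
    result = simulate_collision(0.5, 0.5)
    assert result["event"] == "fermion"          # collapse did occur
    assert 0 < result["step"] <= 100             # within expected bounds

def test_misaligned_phases_do_not_collapse():
    assert simulate_collision(0.0, 1.0)["event"] is None
```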
Statistical runs randomize the vacuum, or some other preconditions, and run the simulation many times to produce a data set of equivalent events and a map of variant properties. Examples might include the Compton radius of a particle, field statistics, and functions for covariance of properties.
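The batching could look something like the sketch below, where `run_once` is a hypothetical placeholder for one deterministic run over a randomized vacuum, and the whole batch is seeded so it remains reproducible:

```python
import random
import statistics

def run_once(rng):
    # Placeholder for one deterministic simulation run over a randomized
    # vacuum, returning a hypothetical measured property of the event.
    return 1.0 + rng.gauss(0.0, 0.05)

def monte_carlo(n_runs, seed=42):
    rng = random.Random(seed)   # seeded: the whole batch is reproducible
    samples = [run_once(rng) for _ in range(n_runs)]
    return {
        "mean": statistics.mean(samples),    # central value of the property
        "stdev": statistics.stdev(samples),  # spread across equivalent events
    }
```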
Where data sets are compared against criteria, we can describe more advanced phenomena. Examples include the probability of decoherence, the confidence of escaping a black hole, matter/anti-matter polarization, and decay distributions.
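In its simplest form, such a criterion-based statistic is just the fraction of runs in a data set that satisfy a predicate; the sketch below assumes each run is recorded as a dictionary of properties (an illustrative choice, not the actual data model):

```python
def probability_of(criterion, dataset):
    """Fraction of runs in the data set satisfying the given criterion."""
    hits = sum(1 for run in dataset if criterion(run))
    return hits / len(dataset)
```

For example, applied to a data set where each run records whether decoherence occurred, `probability_of(lambda run: run["decohered"], dataset)` estimates the probability of decoherence.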
With appropriate navigation controls, the simulation state can be presented as an animated diagram that we can use to demonstrate phenomena, with emphasis on items of interest.
We are in the early stages of development: the basic structures of waves, bosons, and fermions are implemented, along with some event detection and a basic display of entities.
We see great potential for optimisations, especially for high-energy situations, but we are adopting a 'get it working first; optimise later' approach.
The 'deterministic' mechanism means that for any given inputs, the outputs will be the same every time the functions are run. This allows us to develop our software using unit testing and TDD (test-driven development).
Our main challenges are: