Po8
Simulation engine

Why simulation?

My theoretical work has produced emergent qualitative results, and I'm developing simulation software to generate quantitative evidence and statistics that can be reconciled with standard physics.

Methodology

  • A NodeJS implementation, usable from the command line or a web page.
  • A simulation manager with exit conditions: evolve to a given time, to the next event, or for a set number of events (see the sketch after this list).
  • Utility functions, re-used in the implementation and published to help clients set up simulations, perform isolated tests, and interpret results.
  • Logging and data stream APIs, to provide clients with results.
  • TDD, to confirm results and detect regressions.
  • Don't optimise early. The main objective is proof of concept.
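
As a sketch of how the simulation manager's exit conditions might fit together in the NodeJS library (the names Simulation, evolveTo, step, and run are illustrative, not the published API):

  // Illustrative sketch of a simulation manager with the three exit conditions.
  class Simulation {
    constructor(entities) {
      this.time = 0;
      this.entities = entities;
      this.events = [];
    }

    // Exit condition 1: evolve until a given time is reached.
    evolveTo(endTime) {
      while (this.time < endTime) this.step();
    }

    // Exit condition 2: advance to the next event only.
    step() {
      const event = this.nextEvent();
      this.time = event.time;
      this.events.push(event);
      return event;
    }

    // Exit condition 3: run for a fixed number of events.
    run(eventCount) {
      for (let i = 0; i < eventCount; i++) this.step();
    }

    // Placeholder: the real engine would search for the earliest interaction here.
    nextEvent() {
      return { time: this.time + 1, type: 'placeholder' };
    }
  }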

Applications

Basic high-energy mechanics

To test small-scale mechanics, I create scenarios with a small set of entities and stop within a small number of events on the timeline. I then verify that the timeline events fall within expected limits, using logs or visualisations in a web browser.
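
A minimal test of this kind might look as follows, assuming the illustrative Simulation sketch above and Node's built-in assert module:

  const assert = require('assert');

  // Hypothetical small-scale scenario: a few entities, stopping after five events.
  const sim = new Simulation([]);
  sim.run(5);

  // Verify that the timeline events fall within expected limits.
  assert.strictEqual(sim.events.length, 5);
  for (const event of sim.events) {
    assert.ok(event.time >= 0, 'event time should be non-negative');
  }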

Probability distribution functions

These randomize the vacuum, or some other preconditions, and run the simulation many times to produce a data set of equivalent events and a map of the variant properties. Examples might include: the Compton radius of a particle, field statistics, and functions for the covariance of properties.
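
A sketch of that repeated-run approach, with hypothetical randomVacuum and measureRadius helpers standing in for the real preconditions and observable:

  // Hypothetical stand-ins for the randomized preconditions and the measured observable.
  const randomVacuum = () => [];
  const measureRadius = sim => Math.random();

  // Run many randomized simulations and collect one observable per run.
  const samples = [];
  for (let i = 0; i < 10000; i++) {
    const sim = new Simulation(randomVacuum());
    sim.run(100);
    samples.push(measureRadius(sim));
  }

  // Bin the samples into a crude probability distribution.
  const bins = new Map();
  for (const value of samples) {
    const bin = Math.round(value * 10) / 10;
    bins.set(bin, (bins.get(bin) || 0) + 1);
  }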

Interpretive phenomena

Where data sets are compared against criteria, I can describe more advanced phenomena. Examples include: the probability of decoherence, the confidence of escaping a black hole, matter/anti-matter polarization, and decay distributions.
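
As an illustration, such a probability could be estimated by comparing each result in the data set from the previous sketch against a criterion (the decohered predicate is hypothetical):

  // Hypothetical criterion: does a single result count as decohered?
  const decohered = result => result > 0.5;

  // Fraction of runs in the data set that meet the criterion.
  const matching = samples.filter(decohered);
  const probability = matching.length / samples.length;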

Scripted illustration

With appropriate navigation controls, the simulation state can be presented as an animated diagram that demonstrates phenomena, with emphasis on items of interest.

Implementation

Done

  • NodeJS library with utility functions, simulation control, and massless interactions.
  • Automated testing. Test suites will grow as I add features.
  • Functions to export the current matter network state as human-readable text, HTML tables, and SVG graphics (sketched below).
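
The shape of such an export function, sketched against an assumed network structure (the library's actual signatures may differ):

  // Hypothetical sketch: render a matter network state as a minimal SVG string.
  // The 'network.nodes' shape with x/y coordinates is an assumption.
  function toSvg(network) {
    const circles = network.nodes
      .map(node => `<circle cx="${node.x}" cy="${node.y}" r="4" />`)
      .join('\n  ');
    return `<svg xmlns="http://www.w3.org/2000/svg" width="200" height="200">\n  ${circles}\n</svg>`;
  }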

Next steps

  • Implement rule 5, allowing mass to widen the window for fermion collapse.
  • Logging and data streams.
  • A web interface to show pre-computed simulation timelines.
  • A web interface to set up simulations, and run them interactively.
  • Graphing component to show simulation timelines as network graphs (similar to the hand-drawn illustrations I made for this site).
  • Graphing component to show probability distribution functions of a simulation with parameterised unknowns. This will use a statistical approach, running the simulation many times with randomized variables.

Challenges

Numerical precision

Real-world applications need high numerical precision to represent phase values smaller than 10^-30 of the Planck length at the scale of classical distances. I expect to use an arbitrary-precision library such as BigNum for this, at the expense of performance and implementation convenience.
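
One way such precision could be represented is fixed-point arithmetic over JavaScript's built-in BigInt, sketched here as an assumption rather than the chosen implementation:

  // Fixed-point sketch using built-in BigInt: lengths stored as integer multiples
  // of 10^-40 Planck lengths, so a phase of 10^-30 Planck lengths stays exact.
  const SCALE = 10n ** 40n;                    // internal units per Planck length
  const planckLength = SCALE;                  // 1 Planck length
  const phase = planckLength / (10n ** 30n);   // 10^-30 of a Planck length = 10^10 units
  const doubled = phase + phase;               // arithmetic stays exact, but is slower,
  // and converting back to Number for output needs care.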

Computing resource and optimisation

To reconcile with standard physics, I would need to generate statistical results with sufficient confidence. This needs statistical methods I've not yet developed, as well as computing power and optimisations. I expect the first results to be qualitative and illustrative, followed by low-precision statistical results that I hope will converge on the desired outcome.
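
A first statistical check might be as simple as the standard error of the mean over repeated runs, to judge how far results are from convergence (a generic sketch, not a method the project has settled on):

  // Standard error of the mean over a set of repeated simulation results.
  function standardError(values) {
    const mean = values.reduce((sum, v) => sum + v, 0) / values.length;
    const variance =
      values.reduce((sum, v) => sum + (v - mean) ** 2, 0) / (values.length - 1);
    return Math.sqrt(variance / values.length);
  }

  // A result could be reported once this drops below a chosen tolerance.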

Combinatorics at macro scale

The simulation will be sub-optimal at macro scale, and is best suited to high-energy scenarios. Because an environmental vacuum contains many overlapping spheres, a large space of low-energy entities presents combinatorial challenges that are not easily optimised by space-partitioning alone. Phase and mass partitioning might help to discard irrelevant combinations of entities.
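
A rough sketch of phase partitioning, bucketing entities by a coarse phase key so that only nearby buckets are paired (the bucketing rule and the numeric phase property are assumptions):

  // Group entities into coarse phase buckets so that candidate interactions are
  // drawn only from the same or adjacent buckets, instead of testing every pair.
  function phaseBuckets(entities, bucketWidth) {
    const buckets = new Map();
    for (const entity of entities) {
      const key = Math.floor(entity.phase / bucketWidth);  // assumes a numeric 'phase' property
      if (!buckets.has(key)) buckets.set(key, []);
      buckets.get(key).push(entity);
    }
    return buckets;
  }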

Free parameters

The simulation has free parameters, for example the masses of entities. While I expect the reconciled simulation to have fewer free parameters than standard models, I want to replace them with values derived from geometric expressions.