Po8
Simulation engine

Why simulation?

My theoretical work has produced emergent qualitative results, and I'm developing simulation software to generate quantitative proof and statistics that can be reconciled with standard physics.

Applications

Basic high-energy mechanics

To test small-scale mechanics, I create scenarios with a small set of entities and stop the simulation after a small number of timeline events. I then verify that the timeline events fall within expected limits, using logs or visualisations in a web browser.
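
A minimal sketch of such a test, with stand-in names (evolve, the scenario shape, and the limits are illustrative, not the engine's actual API):

    // Sketch: verify that the first few timeline events of a small scenario
    // fall within expected limits. evolve() is a stand-in for the engine.
    const assert = require('assert');

    function evolve(scenario, { maxEvents }) {
      return scenario.events.slice(0, maxEvents);   // stand-in: fixed events
    }

    const scenario = {
      entities: ['oscillatorA', 'oscillatorB'],
      events: [{ time: 0.1 }, { time: 0.4 }, { time: 0.9 }],
    };

    for (const event of evolve(scenario, { maxEvents: 3 })) {
      assert.ok(event.time >= 0 && event.time <= 1,
        `event at t=${event.time} is outside the expected limits`);
    }
    console.log('all events within expected limits');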

Probability distribution functions

These randomize the vacuum, or some other preconditions, and run the simulation many times to produce a data set of equivalent events and a map of variant properties. Examples might include: the Compton radius of a particle, field statistics, and covariance functions between properties.
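
A minimal sketch of the approach, with a stand-in runOnce in place of a real simulation run and an illustrative binning scheme:

    // Sketch: build an empirical probability distribution for one measured
    // property by running the simulation many times over randomized
    // preconditions. runOnce is a stand-in for a real simulation run.
    function runOnce(rng) {
      // placeholder: a "measured" property derived from random preconditions
      return rng() + rng();   // e.g. sum of two uniform draws, range [0, 2)
    }

    const BINS = 20;
    const histogram = new Array(BINS).fill(0);
    const runs = 10000;

    for (let i = 0; i < runs; i++) {
      const value = runOnce(Math.random);          // randomize preconditions
      const bin = Math.min(BINS - 1, Math.floor((value / 2) * BINS));
      histogram[bin] += 1;                         // accumulate the distribution
    }

    // normalise counts into an empirical probability distribution
    const pdf = histogram.map(count => count / runs);
    console.log(pdf);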

Interpretive phenomena

By comparing data sets against criteria, I can describe more advanced phenomena. Examples include: the probability of decoherence, the confidence of escaping a black hole, matter/antimatter polarization, and decay distributions.
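
As a sketch, such a statistic can reduce to filtering a data set of run outcomes against a criterion (the outcome shape and decoherence flag below are purely illustrative):

    // Sketch: derive an interpretive statistic from a data set of run
    // outcomes. Here, the fraction of runs satisfying a decoherence predicate.
    const outcomes = [
      { decohered: true,  time: 0.3 },
      { decohered: false, time: 0.7 },
      { decohered: true,  time: 0.5 },
    ];

    const matching = outcomes.filter(o => o.decohered);
    const probability = matching.length / outcomes.length;
    console.log(`P(decoherence) ≈ ${probability.toFixed(2)}`);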

Scripted illustration

With appropriate navigation controls, the simulation state can be presented as an animated diagram that demonstrates phenomena, with emphasis on items of interest.

Challenges

Numerical precision

Real-world applications need high numerical precision, to represent phase values smaller than 10^-30 of the Planck length at the scale of classical distances. I expect to use an arbitrary-precision library like BigNum for this, at the cost of performance and a less convenient implementation.
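
As a minimal sketch of the problem, native BigInt fixed-point arithmetic shows how a sub-10^-30 term survives where a 64-bit float silently drops it; a dedicated library would still be needed for division and transcendental functions:

    // Sketch: why 64-bit floats are insufficient. Adding a phase term of
    // 1e-30 to a classical-scale value of 1e+5 is lost entirely in floats,
    // but preserved with fixed-point BigInt arithmetic.
    const a = 1e5 + 1e-30;
    console.log(a === 1e5);                      // true: the phase term vanished

    // Fixed-point with 40 decimal places, using native BigInt.
    const SCALE = 10n ** 40n;
    const big = 100000n * SCALE + 10n ** 10n;    // 1e5 + 1e-30 at this scale
    console.log(big % SCALE);                    // 10000000000n: term preserved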

Computing resources and optimisation

To reconcile the model with standard physics, I need to generate statistical results with sufficient confidence. This requires statistical methods I've not yet developed, as well as computing power and optimisations. I expect the first results to be qualitative and illustrative, followed by low-precision statistical results that I hope will converge on the desired outcome.
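
One standard check I might adopt (the actual methods are still to be chosen) is the standard error of the mean, stopping a batch of runs once it falls below a tolerance:

    // Sketch: decide whether a batch of runs has reached sufficient
    // statistical confidence, using the standard error of the mean.
    function standardError(samples) {
      const n = samples.length;
      const mean = samples.reduce((s, x) => s + x, 0) / n;
      const variance = samples.reduce((s, x) => s + (x - mean) ** 2, 0) / (n - 1);
      return Math.sqrt(variance / n);
    }

    // stand-in results; a real batch would come from simulation runs
    const samples = Array.from({ length: 1000 }, () => Math.random());
    const tolerance = 0.01;
    console.log(standardError(samples) < tolerance
      ? 'confidence reached'
      : 'more runs needed');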

Combinatorics at macro scale

The simulation will be sub-optimal at macro scale, and is best suited to high-energy scenarios. Because an environmental vacuum contains many overlapping spheres, a large space of low-energy entities presents combinatorial challenges that are not easily optimised by space-partitioning alone. Phase and mass partitioning might help to discard irrelevant combinations of entities, as in the sketch below.
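
A minimal sketch of the idea: bucket entities by a coarse spatial cell plus a mass band, and pair only entities that share a bucket (the criteria, and the handling of bucket boundaries, are illustrative only):

    // Sketch: prune candidate entity pairs by partitioning on mass as well
    // as space, discarding clearly irrelevant combinations before the
    // expensive interaction test.
    function bucketKey(entity) {
      const cell = Math.floor(entity.x / 10);               // coarse spatial cell
      const massBand = entity.mass > 0.5 ? 'heavy' : 'light';
      return `${cell}:${massBand}`;
    }

    const entities = [
      { id: 1, x: 3,  mass: 0.9 },
      { id: 2, x: 4,  mass: 0.8 },
      { id: 3, x: 95, mass: 0.1 },
    ];

    // group entities; only pairs sharing a bucket are tested for interaction
    const buckets = new Map();
    for (const e of entities) {
      const key = bucketKey(e);
      if (!buckets.has(key)) buckets.set(key, []);
      buckets.get(key).push(e);
    }

    for (const group of buckets.values()) {
      for (let i = 0; i < group.length; i++) {
        for (let j = i + 1; j < group.length; j++) {
          console.log(`test pair ${group[i].id}-${group[j].id}`);
        }
      }
    }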

Free parameters

The simulation has free parameters, for example the masses of entities. While I expect the reconciled simulation to need fewer free parameters than standard models, I ultimately want to replace them with values derived from geometric expressions.

Processing and optimisations

We're not optimising yet, because we haven't proved that the mechanism works.

The system we're simulating is chaotic and deterministic. The chaotic part means we can't look far into the future without first knowing what happens in the near future, because near-future events can change the system in ways that change the far-future events. We need to process all the events in the order they happen.

The computation is like an n-body problem, in that we need to test all combinations of things to see if they interact. However, it's unlike an n-body problem, because we don't need to infinitely divide time to work out precisely how more than two particles interact. In other words, we don't need to approximate conflicting differential equations. Instead, given the current simulation state, we can calculate all possible interactions with certainty.

We only need to process the next event, then look at the simulation again to identify further events. This presents opportunities for optimisation. We already have an ordered list of possible events. If we edit the list of objects, we only need to re-examine the objects that changed, against the existing set.

For example, if we process an interaction that collapses two oscillators, we remove all events involving those two oscillators from the list, and test for new events involving the new oscillator. With 100 particles, collapsing two into a shell means testing around 100 new possible events rather than roughly 10,000. We then add the new possible events to the list, re-order it by event time, and repeat the whole process.
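
A sketch of this bookkeeping, with illustrative data shapes: each queued event lists the oscillators it involves as parties, and makeShell and findEvents stand in for the engine's interaction tests.

    // Sketch: rebuild the event list after two oscillators collapse into a
    // shell, testing only the new shell against the survivors (~N tests
    // instead of ~N^2).
    function collapse(events, oscillators, a, b, makeShell, findEvents) {
      // drop queued events that involved either collapsed oscillator
      const surviving = events.filter(
        e => !e.parties.includes(a) && !e.parties.includes(b));

      // replace the two oscillators with the new shell
      const remaining = oscillators.filter(o => o !== a && o !== b);
      const shell = makeShell(a, b);

      // test only the new shell against the existing set
      const fresh = remaining.flatMap(o => findEvents(shell, o));
      remaining.push(shell);

      // merge, re-order by event time, and the main loop repeats
      return [...surviving, ...fresh].sort((x, y) => x.time - y.time);
    }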

Methodology

  • A NodeJS implementation, usable from the command line or a web page.
  • A simulation manager, with exit conditions: evolve to a time, to the next event, or for a number of events (see the sketch after this list).
  • Utility functions, re-used in the implementation, and published to help clients set up simulations, perform isolated tests, and interpret results.
  • Logging and data stream APIs, to provide clients with results.
  • TDD, to confirm results and detect regression errors.
  • Don't optimise early. The main objective is proof-of-concept.
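
As a sketch only, the exit conditions listed above might compose like this (evolveUntil, processEvent, and the state shape are illustrative, not the library's actual API):

    // Sketch of a simulation manager with the exit conditions listed above.
    // state.queue is an illustrative time-ordered list of pending events,
    // and processEvent is a stand-in for the engine's event handler.
    function evolveUntil(state, processEvent,
                         { toTime = Infinity, maxEvents = Infinity } = {}) {
      let count = 0;
      while (count < maxEvents && state.queue.length > 0) {
        const next = state.queue[0];       // earliest pending event
        if (next.time > toTime) break;     // stop before exceeding the time limit
        processEvent(state, state.queue.shift());
        count += 1;
      }
      return count;                        // events actually processed
    }

    // usage: next event only, a fixed number of events, or a time limit
    // evolveUntil(state, processEvent, { maxEvents: 1 });
    // evolveUntil(state, processEvent, { maxEvents: 100 });
    // evolveUntil(state, processEvent, { toTime: 2.5 });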

Implementation tasks

Done

  • NodeJS library with utility functions, simulation control, and massless interactions.
  • Automated testing. Test suites will grow as I add features.
  • Functions to export the current matter network state as human-readable text, HTML tables, and SVG graphics.

Next steps

  • Logging and data streams.
  • A web interface to show pre-computed simulation timelines.
  • A web interface to set up simulations, and run them interactively.
  • Graphing component to show simulation timelines as network graphs (similar to the hand-drawn illustrations I made for this site).
  • Graphing component to show probability distribution functions of a simulation with parameterised unknowns. This will take a statistical approach, running the simulation many times with randomized variables.