Introduction

In collaboration with the ubld.it team, HeartMath has designed and built a new generation of state-of-the-art random number generators (NextGen RNGs) to detect collective consciousness on the GCP 2.0 network. The initial deployment is the largest RNG network in the world, comprising 1,000 devices owned by citizen scientists internationally. The devices were designed by experts in cryptography and computer science with experience generating truly random numbers and managing them at this scale. Each device is the product of years of careful design, developed for research use as a high-quality source of random data that meets stringent criteria.

FAQs

What is an RNG?

An RNG is a device that generates random numbers. In a sense, it is like an electronic coin flipper, only instead of producing heads or tails, it produces a 0 or 1 bit. These bits are generated at high speed and combined to produce many random numbers every second.
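To make the "combining" concrete, here is a minimal Python sketch, not the device firmware: it simulates a bit stream and packs the bits into one integer. The helper names and the use of the secrets module as a stand-in bit source are illustrative assumptions.

    import secrets

    def random_bits(n):
        # Stand-in bit source; a real RNG device reads hardware, not software.
        return [secrets.randbits(1) for _ in range(n)]

    def bits_to_number(bits):
        # Combine 0/1 bits into one integer, most significant bit first.
        value = 0
        for b in bits:
            value = (value << 1) | b
        return value

    stream = random_bits(16)
    print(stream, "->", bits_to_number(stream))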

How do the RNGs generate truly random numbers?

The RNGs are designed to generate truly random numbers. Quantum random numbers are extracted from the random behavior of electrons in the circuit, via the quantum avalanche process across a Zener diode. The numbers are then whitened to clean out any other influences or biases coming from environmental effects, such as the temperature of the room, electromagnetic interference, or changes in the input power from the local power grid. That way, when we do see non-random effects, we can attribute them to unconventional causes like collective consciousness.

Where does the randomness come from?

Inside the RNGs are electronic circuits with several Zener diodes, semiconductors that normally allow electric current to flow in only one direction. The diodes are reverse biased, meaning a voltage is applied that pushes electrons opposite to the diode’s preferred direction. The electrons pile up against the diode’s barrier until there are enough to set off a quantum avalanche breakdown. Think of it like snow on a mountain that piles up until even a very small disturbance sets off a cascading avalanche. In the RNG, when enough electrons pile up, a few randomly cross the barrier, knocking other electrons into motion until the entire pile has crossed. Then the build-up of electrons restarts until another avalanche is triggered. This process is very fast; more than one million avalanches occur per second.

The avalanches are not identical. Some pile up more electrons than others, just as snow may pile up in different amounts before an avalanche is triggered. In the case of electrons, several factors determine when the avalanche is set off, and one of them is quantum randomness. This is truly random, and we exploit it to generate truly random numbers. The size of the electron pile-up can be measured by the voltage it generates, and this voltage is the raw number the device captures and aggregates every second.
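As a rough illustration of why avalanche timing yields random numbers, the toy Python model below treats the waiting time before each breakdown as exponentially distributed at roughly one million events per second, and takes the pile-up size (and hence the measured voltage) to be proportional to that waiting time. The rate constant and the proportionality are invented assumptions for the demo, not device specifications.

    import random

    RATE_HZ = 1_000_000  # assumed average avalanche rate (~1 million per second)

    def simulate_avalanches(n):
        # Toy model: random exponential waits between breakdowns; a longer wait
        # means a larger electron pile-up and thus a larger measured voltage.
        voltages = []
        for _ in range(n):
            wait_s = random.expovariate(RATE_HZ)  # time until the next breakdown
            voltages.append(wait_s * RATE_HZ)     # pile-up size, arbitrary units
        return voltages

    print([round(v, 3) for v in simulate_avalanches(5)])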

The voltage-based numbers are driven by quantum randomness, but other factors contribute as well. These include other sources of randomness along with biases from the environment and the electronic components, such as electromagnetic fields, temperature, and component aging. The most prominent bias in many analyses is the temperature inside the RNG, which typically follows the ambient temperature around it. As we learned from the original GCP 1.0 project (History), another factor influencing the numbers is our collective consciousness.

What is whitening?

We need to reduce the effects of non-random sources like temperature enough that the numbers are mostly random, with a detectable influence from collective consciousness remaining. This is achieved by whitening the numbers, a well-known process in the RNG community for ensuring true randomness. It is a way of scrambling the numbers further to eliminate non-random effects. There are several approaches to whitening; these RNGs employ techniques very similar to the ones used in the ORION RNGs from GCP 1.0, since those have already been proven for consciousness detection. Had we used another approach, there would be a risk of either removing too little of the non-randomness, or cleaning it out so well that even the effects of collective consciousness are removed.

There are three steps of whitening, each one further randomizing the numbers. These are accepted techniques for improving randomness, and we look at random numbers generated after each of these steps (a code sketch of the full pipeline follows the list):

  1. The numbers go through a D flip-flop, which generates bits, i.e., ones and zeros (or, more precisely, high vs. low in the analog circuit). It continuously generates only a one until there is a rising edge, i.e., the voltage rises from below average to above average. At that point, it flips to generating only a zero until the next rising edge. Then it flops back to ones, and so on. These flips occur frequently and randomly, more than one million times per second.
  2. The bits from a fixed pair of diodes within the device are combined with each other via XOR. There are several diode pairs in each device. Bits are captured from each diode at a rate of 10,000 per second, i.e., a sampling interval of 0.1 milliseconds (ms). The XOR step takes the bits from the pair of diodes at each 0.1 ms tick and checks whether they are the same. If they are, the outcome is a zero bit; if not, a one. This results in a single stream of 10,000 random bits per second from each diode pair.
  3. The resulting bits are combined with an alternating series of ones and zeros (101010…) via XOR to further randomize them.
  4. After whitening, a random number from 0 to 200 is generated each second by counting the one bits among the first 200 of the 10,000 bits. This is the standard procedure used in past experiments like GCP 1.0. Counting across 200 bits helps to normalize and smooth the data, and keeping a gap of 9,800 unused bits between the counted groups ensures that consecutive random numbers are not related to each other by being adjacent in time.
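The Python sketch below strings the four steps together on simulated data. It is a minimal illustration of the pipeline as described, not the device firmware; the simulated Gaussian voltages, the helper names, and the parameter values are assumptions made for the demo.

    import random

    SAMPLES_PER_SEC = 10_000  # 10,000 bits per diode per second (0.1 ms interval)
    COUNT_BITS = 200          # bits counted each second to form the 0-200 number

    def flip_flop_bits(voltages, average):
        # Step 1: toggle the output on each rising edge, i.e., whenever the
        # voltage crosses from below average to above average.
        bits, state, prev_above = [], 1, False
        for v in voltages:
            above = v > average
            if above and not prev_above:  # rising edge detected
                state ^= 1                # flip between ones and zeros
            prev_above = above
            bits.append(state)
        return bits

    def xor_pair(a, b):
        # Step 2: XOR the streams from a diode pair; matching bits give 0,
        # differing bits give 1.
        return [x ^ y for x, y in zip(a, b)]

    def xor_alternating(bits):
        # Step 3: XOR against the alternating pattern 101010...
        return [b ^ ((i + 1) % 2) for i, b in enumerate(bits)]

    def trial_value(bits):
        # Step 4: count the ones in the first 200 bits of the second;
        # the remaining 9,800 bits go unused.
        return sum(bits[:COUNT_BITS])

    # Simulated noisy voltages stand in for two diodes' avalanche signals.
    diode_a = [random.gauss(1.0, 0.2) for _ in range(SAMPLES_PER_SEC)]
    diode_b = [random.gauss(1.0, 0.2) for _ in range(SAMPLES_PER_SEC)]

    stream = xor_alternating(
        xor_pair(flip_flop_bits(diode_a, 1.0), flip_flop_bits(diode_b, 1.0))
    )
    print("one second's random number:", trial_value(stream))

On unbiased input the printed value should land near 100 on average, the expected mean of a 200-bit count.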

How do we know the numbers are random?

There are many established randomness tests, and we perform several of them. Each device is subjected to a rigorous testing process based on more than one million random numbers. Some tests are performed automatically by the system on a regular basis, while researchers perform others manually. Following are some examples (a sketch of the first two appears after the list):

  • A suite of Z score tests checking that the numbers follow the expected mean and standard deviation of a normal distribution (“bell curve”)
  • Wald-Wolfowitz runs tests ensuring that the lengths of runs (i.e., how many consecutive numbers are all above average or all below) are consistent with randomness
  • Outlier detection that removes very occasional blips in the machinery, identified as values many standard deviations from the mean
  • Daily checks that each RNG’s mean and standard deviation are not too many standard deviations from their expected values
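As a hedged illustration of the first two checks, the Python sketch below runs a Z test on the mean of the 0-to-200 trial values (expected mean 100 and standard deviation sqrt(50), since each value counts 200 fair bits) and a Wald-Wolfowitz runs test on the above-mean/below-mean sequence. The code is ours for illustration, not the project's published test suite.

    import math
    import random

    N_BITS = 200
    MEAN = N_BITS * 0.5               # expected mean: 100
    SD = math.sqrt(N_BITS * 0.25)     # expected standard deviation: sqrt(50)

    def z_test_mean(trials):
        # Z score of the sample mean against the expected binomial mean.
        n = len(trials)
        sample_mean = sum(trials) / n
        return (sample_mean - MEAN) / (SD / math.sqrt(n))

    def runs_test(trials):
        # Wald-Wolfowitz runs test on the above-mean/below-mean sequence.
        signs = [t > MEAN for t in trials if t != MEAN]  # drop exact-mean ties
        n1 = sum(signs)
        n2 = len(signs) - n1
        runs = 1 + sum(a != b for a, b in zip(signs, signs[1:]))
        expected = 1 + 2 * n1 * n2 / (n1 + n2)
        variance = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)
                    / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
        return (runs - expected) / math.sqrt(variance)

    # Simulated trials: each is a count of ones in 200 fair coin flips.
    trials = [sum(random.getrandbits(1) for _ in range(N_BITS))
              for _ in range(1000)]
    print(f"mean z = {z_test_mean(trials):+.2f}, runs z = {runs_test(trials):+.2f}")

For truly random data, both z scores should usually fall within about ±2; persistently large values flag a problem with the device or its data.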
