REG Experiments: Equipment and Design
About 50 years ago, electronic random event generators (REGs) began to be used in a large number of laboratory experiments designed to test the hypothesis that human consciousness might interact directly with sensitive physical systems. An animated graph driven by data from a true random source illustrates the character of a typical random sequence. Another animation (involving no random process) shows how a series of samples gradually builds up a distribution: it depicts balls bouncing through an array of pins in a graphical pinball machine, in the manner of a Galton board.
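The pinball animation is straightforward to reproduce. The following Python sketch (an illustration, not part of any PEAR software) simulates balls bouncing left or right through rows of pins; the bin counts build up the familiar bell-shaped binomial distribution:

```python
import random
from collections import Counter

def galton_board(n_balls=10_000, n_rows=12, seed=1):
    """Each ball falls through n_rows pins, bouncing left (0) or right (1)
    with equal probability; its final bin is the number of rightward
    bounces, so the bin counts follow a binomial(n_rows, 0.5) law."""
    rng = random.Random(seed)
    bins = Counter()
    for _ in range(n_balls):
        bins[sum(rng.randint(0, 1) for _ in range(n_rows))] += 1
    return bins

bins = galton_board()
# The counts peak near n_rows / 2 and fall off symmetrically, tracing
# the bell shape that the animation builds up ball by ball.
```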
The results of many experiments using REGs provide clear statistical evidence that the behavior of these devices deviates from chance expectation in correlation with the pre-defined intentions of participants. In 1979, the Princeton Engineering Anomalies Research (PEAR) laboratory began collecting large databases in a REG experiment with particularly rigorous controls and a variety of optional parameters, designed to assess the reliability and the nature of the apparent mind/machine interaction. Over a 12-year period of primary investigation, ten physical and psychological conditions were examined as possible mediating variables in the experimental results. A number of extensions and variations on the basic protocol have also been explored, using several random sources as well as a selection of different physical systems whose performance depends in a fundamental way on some form of random process. A brief summary of the REG results, based on an analysis of variance, is available.
In 1993, a new protocol was developed to allow experimentation in the field. A portable REG connected to a portable computer allows freedom of movement for field applications. Typically, the device is brought into a group situation to record data continuously in the background while the participants are engaged with each other or the ongoing events. A crucial difference between these FieldREG experiments and the laboratory REG work is that in the former, there is no assigned intention to interact with or influence the device. Instead, the FieldREG has the role of a simple monitor, with the purpose of recording data that will subsequently be examined for deviations that correspond with pre-specified time periods. Both the laboratory and field versions of this research have accompanying calibrations and control data which confirm that the random sources are of high quality, delivering data that conform to theoretical expectations in control conditions.
The PEAR program has used three generations of random event generators, with different primary sources of white noise but with important common design features. The original "benchmark" experiment used a commercial random source developed by Elgenco, Inc., the core of which is proprietary. Elgenco's engineering staff describe the proprietary module as "solid state junctions with precision pre-amplifiers," implying processes that rely on quantum tunneling to produce an unpredictable, broad-spectrum white noise in the form of low-amplitude voltage fluctuations. The PEAR Portable REG uses Johnson noise in resistors, the so-called thermal noise, which is also a quantum-level phenomenon and produces a well-behaved broad-spectrum fluctuation. The Mindsong Micro-REG uses a field effect transistor (FET) as the primary noise source, again relying on quantum tunneling, and providing completely uncorrelated fundamental events that compound to an unpredictable voltage fluctuation.
In all cases the design begins with white noise; in the PEAR Portable REG, for example, the spectrum is flat to within +/- 1 dB from 1100 Hz to 30 kHz. A low-end cutoff at 1000 Hz eliminates frequencies at and below the data-sampling rate. This filtering, together with appropriate amplification and clipping, produces an approximate square wave with unpredictable temporal variation. Sampling at a constant 1 kHz rate is typical, although special sources allowing higher rates (up to 2 MHz) have been constructed. Analog and digital processes are completely isolated by alternating the two operations, which excludes contamination of the analog noise train by digital pulses. To eliminate biases of the mean that might arise from environmental stresses such as temperature change or component aging, an exclusive-or (XOR) mask is applied to the digital data stream. The mask is either an alternating 1/0 pattern or a more complex array of all bytes with equal occurrence of 1s and 0s. Both exclude bias of the mean, in principle, and the latter also excludes all short-lag bit-to-bit and byte-to-byte autocorrelations. Finally, data for the PEAR experiments are recorded as trials that are the sum of N samples (e.g., N = 200 bits) from the primary sequence, further mitigating any residual short-lag autocorrelations. The result is a data sequence conforming to the appropriate theoretical binomial distribution and to its normal approximation.
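As a rough illustration of the debiasing logic described above, the sketch below applies an alternating 1/0 XOR mask to a deliberately biased software bit stream and then sums the masked bits into 200-bit trials. The function names and the simulated stream are illustrative assumptions; in the actual devices the mask is applied to the hardware data stream:

```python
import random

def xor_mask(bits):
    """XOR the bit stream with an alternating 1,0,1,0,... mask. A constant
    drift in the raw mean (temperature, component aging) is inverted on
    every other sample and so cancels in the masked output."""
    return [b ^ (1 - i % 2) for i, b in enumerate(bits)]

def form_trials(bits, n=200):
    """Sum successive non-overlapping blocks of n bits into trials,
    further diluting any residual short-lag autocorrelation."""
    return [sum(bits[i:i + n]) for i in range(0, len(bits) - n + 1, n)]

# Demonstration with a deliberately biased source, p(1) = 0.6:
rng = random.Random(0)
raw = [1 if rng.random() < 0.6 else 0 for _ in range(200_000)]
masked = xor_mask(raw)
trials = form_trials(masked)
# raw mean ~ 0.6, masked mean ~ 0.5, trial means cluster around 100
```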
The final output of the PEAR devices is a sequence of bytes presented to the computer's serial port, which is then formed into a sequence of trials (typically sums of 200 bits) generated at one per second. Calibrations on all of the devices show behavior that closely matches theoretical expectations for the mean, variance, skew, and kurtosis.
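The calibration targets follow directly from the binomial statistics of a 200-bit trial. A minimal sketch of the theoretical reference values, using the standard binomial moment formulas (this is not PEAR code):

```python
import math

def binomial_trial_moments(n=200, p=0.5):
    """Theoretical moments of a trial formed as the sum of n random bits,
    the reference against which device calibrations are checked."""
    mean = n * p
    var = n * p * (1 - p)
    skew = (1 - 2 * p) / math.sqrt(var)
    excess_kurtosis = (1 - 6 * p * (1 - p)) / var
    return mean, var, skew, excess_kurtosis

# For the standard 200-bit trial: mean 100, variance 50 (sd ~ 7.071),
# zero skew, and a slight negative excess kurtosis of -0.01.
```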
REG Experiment Design
Given a sequence of trials with a well-defined expectation for the mean and standard deviation (100 and 7.071, respectively, for 200-bit trials), participants try to change the output according to pre-stated intentions. The situation is analogous to trying to get more "heads" or more "tails" while flipping an unbiased coin. The REG is in this sense a very sophisticated, high-speed electronic "coin-flipper," connected to a computer for reliable data collection in controlled experiments. The computer also allows immediate computation of statistics and feedback of various kinds, including graphic displays of the accumulating deviations from what is expected of an undisturbed random process.
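The graphic feedback described above is typically a cumulative-deviation trace. A minimal sketch, assuming simulated 200-bit trials in place of device data:

```python
import random

def cumulative_deviation(trials, expected_mean=100.0):
    """Running sum of (trial - expected mean), the quantity plotted as
    graphic feedback; for an undisturbed source it is an unbiased
    random walk around zero."""
    total, path = 0.0, []
    for t in trials:
        total += t - expected_mean
        path.append(total)
    return path

# Simulated session: 500 trials, each the sum of 200 fair bits.
rng = random.Random(42)
trials = [sum(rng.randint(0, 1) for _ in range(200)) for _ in range(500)]
path = cumulative_deviation(trials)
# path[-1] / (7.071 * 500 ** 0.5) is the terminal z-score of the session.
```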
The basic design for laboratory experiments using the REG technology constitutes a final level of protection against artifactual sources of apparent effect. It is a "tripolar" design, in which participants generate data under three conditions of pre-specified intention: to achieve high (HI) means, to achieve low (LO) means, or to generate baseline (BL) data. In addition to this primary variable, a number of secondary parameters are represented as options that can be explored. These include the identity of the individual operators (participants), permitting robust comparisons among a subset of prolific operators who have done many replications of the experiment. A related, simpler variable is operator gender, including co-operator pairs who may be of the same or opposite sex, and who may be "bonded" couples.
Different sources for the data include the true random sources described earlier, as well as both hardware and algorithmic pseudorandom generators. Other parameters include the distance from the operator to the machine (up to thousands of miles) and analogous separations in time (up to several hours or a few days). The information density (bits per second) and the number of trials per run have been varied, as have the instruction mode, the type of feedback, and the replication number or serial position. A number of publications give further details.
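One simple way to score data from the tripolar HI/LO/BL protocol is a z-score of each intention's trial mean against theoretical expectation. This is an illustrative sketch under that assumption, not the PEAR analysis code; `score_session` and its conventions are hypothetical:

```python
import math

THEORY_MEAN, THEORY_SD = 100.0, math.sqrt(50)  # sum of 200 fair bits

def intention_z(trials):
    """z-score of a trial series against theoretical expectation."""
    n = len(trials)
    return (sum(trials) / n - THEORY_MEAN) * math.sqrt(n) / THEORY_SD

def score_session(hi, lo, bl):
    """Score one tripolar session: an intentional effect would appear as
    z > 0 in the HI data, z < 0 in LO, and z near zero in BL."""
    return {"HI": intention_z(hi), "LO": intention_z(lo), "BL": intention_z(bl)}
```

For example, a HI series of 100 trials averaging 101 would score z = 1 * sqrt(100) / 7.071, about 1.41.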