- Chaos theory studies complex systems. A new study shows that chaos may be too, well, chaotic for computers to recreate on their own.
- It's a question of numbers. Computers can only represent fractions built from powers of 2: 2, 4, 8, 16, 32, 64, and so on. Chaos needs more than that.
- The butterfly effect relies on tiny rounding errors. If computers can't compensate for those, they're not much good at simulating chaotic systems.

Computers are great at creating order. But a recent study shows that they're poor at creating the exact opposite: chaos. It's beyond the ability of digital computers to reliably reproduce the behavior of chaotic systems, researchers say.

Scientifically speaking, chaotic systems are "dynamical systems that are highly sensitive to initial conditions." In popular culture, that sensitivity is best known as the butterfly effect.

“Our work shows that the behavior of the chaotic dynamical systems is richer than any digital computer can capture,” says Peter Coveney, Director of the University College London Centre for Computational Science and a coauthor of the study, in a press statement. “Chaos is more commonplace than many people may realize and even for very simple chaotic systems, numbers used by digital computers can lead to errors that are not obvious but can have a big impact. Ultimately, computers can’t simulate everything.”

The team examined what's known as floating-point arithmetic, which since the 1950s has been the ubiquitous method of approximating real numbers on digital computers. Computers are limited in the numbers they can represent exactly: rational numbers whose denominators are powers of 2, such as 2, 4, 8, 16, 32, 64, and 128. There are, of course, many more numbers than these.
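That limitation is easy to see directly. The Python sketch below (an illustration, not the study's code) shows that a simple decimal like 0.1 is silently replaced by the nearest fraction whose denominator is a power of 2:

```python
from fractions import Fraction

# 0.1 has no finite binary expansion, so the machine stores the
# nearest representable fraction with a power-of-2 denominator.
stored = Fraction(0.1)
print(stored)                        # an odd numerator over 2**55
print(stored.denominator == 2**55)   # True: the denominator is a power of 2

# That tiny substitution is enough to break exact decimal arithmetic.
print(0.1 + 0.2 == 0.3)              # False
```

The substituted value differs from 0.1 only in the 17th decimal place, but in a chaotic system that is exactly the kind of error that grows.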

In the study, the team worked with all four billion of these single-precision floating-point numbers, which range from minus infinity to plus infinity.

“The four billion single-precision floating-point numbers that digital computers use are spread unevenly, so there are as many such numbers between 0.125 and 0.25, as there are between 0.25 and 0.5, as there are between 0.5 and 1.0,” says coauthor Bruce Boghosian of Tufts University.

“It is amazing that they are able to simulate real-world chaotic events as well as they do,” Boghosian continues. “But even so, we are now aware that this simplification does not accurately represent the complexity of chaotic dynamical systems, and this is a problem for such simulations on all current and future digital computers.”
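Boghosian's point about uneven spread can be checked by counting bit patterns. In IEEE 754 single precision, consecutive positive floats have consecutive bit patterns, so subtracting the patterns of two values counts the floats between them. A small sketch (my own illustration, not from the study):

```python
import struct

def f32_bits(x: float) -> int:
    """Bit pattern of x rounded to IEEE 754 single precision."""
    return struct.unpack('<I', struct.pack('<f', x))[0]

# For positive floats, consecutive values have consecutive bit patterns,
# so the difference of patterns counts the floats in [a, b).
for a, b in [(0.125, 0.25), (0.25, 0.5), (0.5, 1.0)]:
    print(f"[{a}, {b}): {f32_bits(b) - f32_bits(a)} floats")
```

Each of the three intervals holds exactly 2**23 = 8,388,608 values, even though each interval is twice as long as the one before: the numbers bunch up near zero and thin out away from it.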

The work builds on the earliest demonstrations of the butterfly effect, made by MIT scientist Edward Lorenz in the 1960s. Studying weather prediction, Lorenz showed that even tiny rounding errors in the numbers fed into his computer led to noticeably different forecasts. Even something as tiny as a butterfly flapping its wings could make a change, Lorenz argued.
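Lorenz's observation is easy to reproduce. The sketch below (an illustration, not the study's code) integrates his three famous equations with a crude Euler step and runs two copies whose starting points differ by one part in a billion; the runs soon disagree wildly:

```python
def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations."""
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)   # a "butterfly": one part in a billion

max_sep = 0.0
for _ in range(10000):        # simulate out to t = 50
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    sep = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    max_sep = max(max_sep, sep)

print(f"largest separation: {max_sep:.2f}")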

To study a computer's ability to create chaos, the scientists used a relatively simple chaotic system known as a "generalized Bernoulli map."

“We use the generalized Bernoulli map as a mathematical representation for many other systems that change chaotically over time, such as those seen across physics, biology and chemistry,” says Coveney. “These are being used to predict important scenarios in climate change, in chemical reactions and in nuclear reactors, for example, so it’s imperative that computer-based simulations are now carefully scrutinized.”
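The simplest member of that family, the doubling map x → 2x mod 1, makes the problem vivid. On paper its orbits are chaotic for almost every starting point, but in floating point every orbit dies: multiplying by 2 is exact in binary, so each step just shifts bits out of the finite fraction until nothing is left. A short sketch (my illustration, not the study's code):

```python
x = 0.1                      # stored as an odd numerator over 2**55
for step in range(60):
    x = (2.0 * x) % 1.0      # Bernoulli (doubling) map: chaotic on paper
    if x == 0.0:
        print(f"orbit collapsed to 0 after {step + 1} steps")
        break
```

A true orbit of the doubling map never settles down; the collapse to zero is purely an artifact of finite binary fractions, the kind of discrepancy the study quantifies.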

The scientists say that more research is needed into how floating-point arithmetic may be distorting computational science, and they suggest that current numerical methods need scrutiny in a number of fields, including mathematics.