‘Momentum Computing’ Pushes Technology’s Thermodynamic Limits


In case you haven’t noticed, computers run hot. Data centers consume an estimated 200 terawatt-hours each year, comparable to the energy consumption of some mid-sized countries, and a laptop can pump out an uncomfortable amount of heat. The carbon footprint of information and communication technologies as a whole is close to that of fuel use in the aviation industry. And as computer circuits get smaller and more densely packed, they become more susceptible to melting from the energy they dissipate as heat.

Now physicist James Crutchfield of the University of California, Davis, and his graduate student Kyle Ray have proposed a new way to do computation that would dissipate only a small fraction of the heat produced by conventional circuits. In fact, their approach, described in a recent preprint, could bring heat dissipation below even the theoretical minimum that the laws of physics impose on today’s computers. That could greatly reduce the energy needed both to perform computations and to keep circuitry cool. And all of this could be done, the researchers say, using microelectronic devices that already exist.

In 1961 physicist Rolf Landauer of IBM’s Thomas J. Watson Research Center in Yorktown Heights, N.Y., showed that conventional computing incurs an unavoidable cost in energy dissipation, basically the generation of heat and entropy. That is because a conventional computer sometimes has to erase bits of information in its memory circuits to make room for more. Each time a single bit (with a value of 1 or 0) is reset, a certain minimum amount of energy is dissipated, which Ray and Crutchfield have christened “the Landauer.” Its value depends on the ambient temperature: in your living room, one Landauer is around 10⁻²¹ joule. (For comparison, a lit candle emits energy on the order of 10 joules per second.)
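The figure quoted above follows from Landauer’s bound, E = kT ln 2, where k is the Boltzmann constant and T the ambient temperature. A quick order-of-magnitude check (an illustrative sketch, not part of the researchers’ work):

```python
import math

# Landauer bound: minimum energy dissipated to erase one bit is
# E = k * T * ln(2)
k = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI definition)
T = 300.0         # roughly "living room" temperature, in kelvin

one_landauer = k * T * math.log(2)
print(f"One Landauer at {T} K: {one_landauer:.3e} J")  # ~2.9e-21 J

# A candle emits on the order of 10 J per second; at the Landauer
# limit, that energy would pay for an enormous number of bit erasures.
print(f"Erasures per candle-second: {10 / one_landauer:.1e}")
```

This is why the limit matters in principle but has never been the binding constraint in practice: as noted below, real silicon dissipates thousands of Landauers per operation.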

Computer scientists have long recognized that the Landauer limit on how little heat a computation generates can be undercut by never deleting any information. A computation done that way is fully reversible, because throwing away no information means each step can be undone. The process might sound as if it would quickly fill up a computer’s memory. But in the 1970s Charles Bennett, also at T. J. Watson, showed that instead of discarding information at the end of a computation, one could set it up to “decompute” intermediate results that are no longer needed by reversing their logical steps, returning the computer to its original state.

The catch is that, to avoid transferring any heat (that is, to be what physicists call an adiabatic process), the sequence of logical operations in the computation must generally be carried out infinitely slowly. In a sense, this approach avoids any “frictional heating” in the process, but at the cost of taking infinitely long to complete the computation.

So it does not seem a very practical solution. “Conventional wisdom for a long time was that energy dissipation in reversible computing is proportional to speed,” says computer scientist Michael Frank of Sandia National Laboratories in Albuquerque, N.M.

To the Limit and Beyond

Silicon-based computing is nowhere near the Landauer limit anyway: at present, this type of computing generates around a few thousand Landauers of heat per logical operation, and it is hard to see how even some superefficient silicon chip of the future could get below about 100. But Ray and Crutchfield say it is possible to do better by encoding information in electric currents in a new way: not as pulses of charge but in the momentum of the moving particles. They say this would allow the computation to be done reversibly without sacrificing speed.

The two researchers and their colleagues introduced the basic idea of momentum computing last year. The key concept is that a bit-encoding particle’s momentum can provide a kind of “free” memory, because it carries information about the particle’s past and future motion, not just its instantaneous state. “Previously, information was stored positionally: ‘Where is the particle?’” Crutchfield says. Is a given electron in this channel, for example, or in that one? “Momentum computing uses information in both position and velocity,” he says.

This extra information can then be leveraged for reversible computing. For the idea to work, the logical operations have to happen much faster than the time it takes for the bit to come into thermal equilibrium with its surroundings, which would randomize the bit’s motion and scramble the information. In other words, momentum computing demands that the device operate at high speed, Crutchfield says. For it to work, you have to “compute fast,” that is, non-adiabatically.

The researchers considered how to use the idea to implement a logical operation called a bit swap, in which two bits simultaneously exchange their values: 1 becomes 0 and vice versa. Here no information is discarded; it is merely reconfigured, so in theory the operation carries no erasure cost.

Yet if the information is encoded only in a particle’s position, a bit swap (switching particles between a left-hand channel and a right-hand channel, say) means that their identities get scrambled, so the “before” and “after” states cannot be told apart. But if the particles have opposite momenta, they remain distinct, and the operation produces a genuine and reversible change.
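Why a bit swap costs nothing in principle: it is a bijection on the pair of bits, so every distinct input maps to a distinct output and nothing is erased. A minimal sketch (mine, not the researchers’):

```python
# A bit swap exchanges two bit values: (a, b) -> (b, a).
def bit_swap(a: int, b: int) -> tuple:
    return b, a

# Reversible: applying the swap twice restores the original state.
state = (1, 0)
assert bit_swap(*bit_swap(*state)) == state

# Bijective: all four possible inputs map to four distinct outputs,
# so no information is destroyed and no Landauer cost is mandatory.
inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
outputs = [bit_swap(a, b) for a, b in inputs]
assert len(set(outputs)) == len(inputs)
```

The physical subtlety the article describes is not in this logic table but in the hardware: with position-only encoding, two swapped particles become physically indistinguishable, and that is what opposite momenta prevent.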

A Practical Device

Ray and Crutchfield have described how the idea might be implemented in a practical device, specifically in superconducting flux quantum bits, or qubits, the standard bits used in many of today’s quantum computers. “We’re becoming parasites on the quantum computing community!” Crutchfield says cheerfully. These devices consist of loops of superconducting material interrupted by structures called Josephson junctions (JJs), in which a thin layer of nonsuperconducting material is sandwiched between two superconductors.

Information in JJ circuits is typically encoded in the circulation of the supercurrent, which can be switched using microwave radiation. But because supercurrents carry momentum, they can also be used for momentum computing. Ray and Crutchfield performed simulations suggesting that, under certain conditions, JJ circuits should be able to support their momentum-computing approach. If cooled to liquid-helium temperatures, such a circuit could carry out a single bit-swap operation in less than 15 nanoseconds.

“While our proposal is grounded in a specific substrate to be as concrete as possible and to accurately estimate the required energies,” Crutchfield says, “the proposal is much more general than that.” In principle it should work with conventional (albeit cryogenically cooled) electronic circuits, or even with tiny, carefully insulated mechanical devices that carry momentum (and thus can compute) in their moving parts. Still, Crutchfield says, an approach using superconducting bits may be especially suitable because it is “a microtechnology that is known to scale up very well.”

Crutchfield should know: working with Michael Roukes and his collaborators at the California Institute of Technology, Crutchfield previously measured the cost of erasing a bit in a JJ device and showed that it comes close to the Landauer limit. In the 1980s Crutchfield and Roukes even served as consultants for an IBM effort to build a reversible JJ computer, which was eventually abandoned because of the extremely demanding fabrication requirements of the time.

Follow the Bouncing Ball

Exploiting a particle’s velocity is not an entirely new idea in computing. Momentum computing is closely related to a reversible-computing concept called ballistic computing, proposed in the 1980s: in it, information is encoded in objects or particles that move freely through the circuits under their own inertia, carrying a signal that is used repeatedly to carry out many logical operations. If a particle interacts elastically with others, it loses no energy in the process. In such a device, once the ballistic bits have been “launched,” they run the computation on their own, with no further input of energy. The computation remains reversible as long as the bits keep bouncing along their trajectories. Information is erased, and energy is dissipated, only when their states are read out.
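A standard logic gate from this ballistic tradition (the billiard-ball model; the gate is not named in the article, so this is an illustrative aside) is the Fredkin, or controlled-swap, gate. It is its own inverse, and it conserves the number of 1s in its inputs, mirroring how elastic collisions conserve the particles themselves:

```python
# Fredkin (controlled-swap) gate: if the control bit c is 1,
# swap the targets a and b; otherwise pass them through unchanged.
def fredkin(c: int, a: int, b: int) -> tuple:
    return (c, b, a) if c == 1 else (c, a, b)

from itertools import product

for c, a, b in product((0, 1), repeat=3):
    # Reversible: applying the gate twice restores any input.
    assert fredkin(*fredkin(c, a, b)) == (c, a, b)
    # Conservative: the count of 1s ("particles") never changes.
    assert sum(fredkin(c, a, b)) == c + a + b
```

The gate is also universal for classical logic, which is why a frictionless sea of bouncing bits could, in principle, compute anything.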

In ballistic computing, a particle’s velocity lets it carry information from input to output simply by moving it across the device, whereas in momentum computing, Crutchfield says, a particle’s velocity and position together allow it to embody a unique and unambiguous sequence of states during a computation. The latter property is the key to reversibility, and thus to low dissipation, he adds, because it makes it possible to tell exactly where each particle has been.

Researchers, including Frank, have worked on ballistic reversible computing for decades. One difficulty is that ballistic computing as first proposed is dynamically unstable: particle collisions, for instance, can be chaotic and therefore highly sensitive to the tiniest random fluctuations, which makes them impossible to reverse. But researchers have made progress on these problems. In a recent preprint paper, Kevin Osborn and Waltraut Wustmann, both at the University of Maryland, proposed that JJ circuits could be used to make a reversible ballistic logic circuit called a shift register, in which the output of one logic gate becomes the input of the next via “flip-flop” operations.

“Superconducting circuits are a good platform for testing reversible circuits,” Osborn says. He adds that his JJ circuits seem very close to those specified by Ray and Crutchfield, so they may be the best candidates for testing the pair’s ideas.

“I would say that all of our groups work from an intuition that these methods can achieve a better trade-off between efficiency and speed than traditional approaches to reversible computing,” Frank says. Ray and Crutchfield “have probably done the most extensive work so far to demonstrate this at the level of theory and simulation of individual devices.” Even so, Frank cautions, all the various approaches to ballistic and momentum computing are “still far from becoming a practical technology.”

Crutchfield is more upbeat. “It really depends on people supporting the acceleration,” he says. He thinks small, low-dissipation momentum-computing JJ circuits could be possible within a few years, with full microprocessors debuting this decade. Ultimately, he anticipates that consumer-grade momentum computing could achieve energy-efficiency gains of 1,000-fold or more over current approaches. “Think [if] your Google server farm housed in a giant warehouse, using 1,000 kilowatts for computing and cooling, [were instead] reduced to just one kilowatt, equivalent to a few incandescent lightbulbs,” Crutchfield says.

But Crutchfield says the benefits of the new approach could be broader than a practical reduction in energy costs. “Momentum computing will lead to a conceptual shift in how we see information processing in the world,” he says, “including how information is processed in biological systems.”
