
A new kind of quantum engine with ultracold atoms

By: VM
7 August 2025 at 15:30

In conventional ‘macroscopic’ engines like the ones that guzzle fossil fuels to power cars and motorcycles, the fuels are set ablaze to release heat, which is converted to mechanical energy and transferred to the vehicle’s moving parts. In order to perform these functions over and over in a continuous manner, the engine cycles through four repeating steps. There are different kinds of cycles depending on the engine’s design and needs. A common example is the Otto cycle, where the engine’s four steps are: 

1. Adiabatic compression: The piston compresses the air-fuel mixture, increasing its pressure and temperature without exchanging heat with the surroundings

2. Constant volume heat addition: At the piston’s top position, a spark plug ignites the fuel-air mixture, rapidly increasing pressure and temperature while the volume remains constant

3. Adiabatic expansion: The high-pressure gas pushes the piston down, doing work on the piston, which powers the engine

4. Constant volume heat rejection: At the bottom of the piston stroke, heat is expelled from the gas at constant volume as the engine prepares to clear the exhaust gases

So the engine goes 1-2-3-4-1-2-3-4 and so on. This is useful. If you plot the pressure and volume of the fuel-air mixture in the engine on the two axes of a graph, you’ll see that at the end of the ‘constant volume heat rejection’ step (no. 4), the mixture is in the same state as it was at the start of the adiabatic compression step (no. 1). The work the engine does on the vehicle is equal to the difference between the work done during the expansion and compression steps. Engines are designed to meet this cyclical requirement while maximising the work done for a given fuel and vehicle design.
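The ideal-gas arithmetic behind these four steps can be sketched in a few lines of code. This is a minimal illustration: the compression ratio, heat input, and intake temperature below are made-up numbers, not any real engine’s specification.

```python
# A minimal sketch of the ideal-gas Otto cycle described above.
# All input values are illustrative, not a real engine's specification.

GAMMA = 1.4  # heat-capacity ratio for air (diatomic gas)

def otto_cycle(r, q_in, cv=0.718, t1=300.0):
    """Trace the gas temperature through steps 1-4 and return
    (net work, efficiency).

    r    : compression ratio V1/V2
    q_in : heat added per unit mass in step 2 (kJ/kg)
    cv   : specific heat at constant volume (kJ/kg.K)
    t1   : intake temperature (K)
    """
    t2 = t1 * r ** (GAMMA - 1)   # 1. adiabatic compression
    t3 = t2 + q_in / cv          # 2. constant-volume heat addition
    t4 = t3 * r ** (1 - GAMMA)   # 3. adiabatic expansion
    q_out = cv * (t4 - t1)       # 4. constant-volume heat rejection
    w_net = q_in - q_out         # net work per cycle (first law)
    return w_net, w_net / q_in

w, eta = otto_cycle(r=8.0, q_in=800.0)
# The textbook Otto efficiency 1 - r**(1 - gamma) falls out automatically:
assert abs(eta - (1 - 8.0 ** (1 - GAMMA))) < 1e-9
```

Note how the efficiency ends up depending only on the compression ratio: the cycle’s design, not just its fuel, determines how much of the heat becomes work.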

It’s easy to understand the value of machines like this. They’re the reason we have vehicles that we can drive in relative comfort using our hands, legs, and senses. As long as we refill the fuel tank once in a while, engines can repeatedly perform mechanical work using their fuel combustion cycles. It’s understandable then why scientists have been trying to build quantum engines: while conventional engines operate on classical physics, quantum engines are machines that use the ideas of quantum physics. For now, however, these machines remain futuristic because scientists don’t yet understand their working principles well enough. University of Kaiserslautern-Landau professor Artur Widera told me the following in September 2023, after he and his team published a paper reporting that they had developed a new kind of quantum engine:

Just observing the development and miniaturisation of engines from macroscopic scales to biological machines and further potentially to single- or few-atom engines, it becomes clear that for few particles close to the quantum regime, thermodynamics as we use in classical life will not be sufficient to understand processes or devices. In fact, quantum thermodynamics is just emerging, and some aspects of how to describe the thermodynamical aspects of quantum processes are even theoretically not fully understood.

This said, recent advances in ultracold atomic physics have allowed physicists to control substances called quantum gases in so-called low-dimensional regimes, laying the ground for them to realise and study quantum engines. Two recent studies exemplify this progress: the study by Widera et al. in 2023 and a new theoretical study reported in Physical Review E. Both studies have explored engines based on ultracold quantum gases but have approached quantum energy conversion from complementary perspectives.

The Physical Review E work investigated a ‘quantum thermochemical engine’ operating with a trapped one-dimensional (1D) Bose gas in the quasicondensate regime as the working fluid — just like the fuel-air mixture in the internal combustion engine of a petrol-powered car. A Bose gas is a quantum system made of particles called bosons, which have integer spin; in ultracold-atom experiments these are typically whole atoms rather than subatomic particles. The ‘1D’ simply means they are limited to moving back and forth along a straight line, i.e. a single spatial dimension. This restriction dramatically changes the bosons’ physical and quantum properties.

According to the paper’s single author, University of Queensland theoretical physicist Vijit Nautiyal, the resulting engine can operate on an Otto cycle where the compression and expansion steps — which dictate the work the engine can do — are implemented by tuning how strongly the bosons interact, instead of changing the volume as in a classical engine. In order to do this, the quantum engine needs to exchange not heat with its surroundings but particles. That is, the particles flow from a hot reservoir to the working boson gas, allowing the engine to perform net work.

Energy enters and leaves the system in the A-B and C-D steps, respectively, when the engine absorbs and releases particles from the hot reservoir. The engine consumes work during adiabatic compression (D-A) and performs work during adiabatic expansion (B-C). The difference between these steps is the engine’s net work output. Credit: arXiv:2411.13041v2
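The cycle’s four strokes obey simple first-law bookkeeping, which can be sketched as follows. The stroke energies here are hypothetical placeholders, not values from the paper; only the structure of the accounting is the point.

```python
# Toy first-law bookkeeping for the A-B-C-D cycle in the figure.
# The four state energies are made-up placeholder values in
# arbitrary units, not numbers from Nautiyal's simulations.

E = {"A": 10.0, "B": 16.0, "C": 9.0, "D": 6.0}  # hypothetical state energies

energy_in  = E["B"] - E["A"]  # A-B: particles absorbed from the hot reservoir
work_out   = E["B"] - E["C"]  # B-C: adiabatic expansion (interactions weakened)
energy_out = E["C"] - E["D"]  # C-D: particles released by the engine
work_in    = E["A"] - E["D"]  # D-A: adiabatic compression (interactions strengthened)

net_work = work_out - work_in        # what the engine delivers per cycle
efficiency = net_work / energy_in    # output per unit of energy absorbed

# First-law consistency: energy absorbed = energy released + net work
assert abs(energy_in - (energy_out + net_work)) < 1e-12
```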

Nautiyal’s study focused on the engine’s performance in two regimes: one where the strength of interaction between bosons was suddenly quenched in order to maximise the engine’s power at the cost of its efficiency, and another where the quantum engine operated at maximum efficiency but produced negligible power. Nautiyal studied both regimes using advanced numerical simulations.

The simulations showed that if the engine only used heat but didn’t absorb particles from the hot reservoir, it couldn’t really produce useful energy at finite temperatures. This was because of complicated quantum effects and uneven density in the boson gas. But when the engine was allowed to gain or lose particles from/to the reservoir, it got the extra energy it needed to work properly. Surprisingly, this particle exchange allowed the engine to operate very efficiently even when it ran fast. Usually, engines have to choose between going fast and losing efficiency or going slow and being more efficient. The particle exchange allowed Nautiyal’s quantum thermochemical engine to avoid that trade-off. Letting more particles flow in and out also made the engine produce more energy and be even more efficient.

Finally, unlike regular engines where higher temperature usually means better efficiency, increasing the temperature of the quantum thermochemical engine too much actually lowered its efficiency, speaking to the important role chemical work played in this engine design.

In contrast, the 2023 experimental study — which I wrote about in The Hindu — realised a quantum engine that, instead of relying on conventional heating and cooling with thermal reservoirs, operated by cycling a gas of particles between two quantum states, a Bose-Einstein condensate and a Fermi gas. The process was driven by adiabatic changes (i.e. changes that happen while keeping the entropy fixed) that converted the fundamental difference in total energy distribution arising from the two states into usable work. The experiment demonstrated that this energy difference, called the Pauli energy, constituted a significant resource for thermodynamic cycles.

The theoretical 2025 paper and the experimental 2023 work are intimately connected as complementary explorations of quantum engine operation using ultracold atomic gases. Both have taken advantage of the unique quantum effects accessible in such systems while focusing on distinct energy resources and operational principles.

The 2025 work emphasised the role of chemical work arising from particle exchange in a one-dimensional Bose gas, exploring the balance of efficiency and power in finite-time quantum thermochemical engines. It also provided detailed computational frameworks to understand and optimise these engines. The 2023 experiment, meanwhile, physically realised a related but conceptually different mechanism: moving lithium atoms between two states and converting their Pauli energy into work. This approach highlighted how the fundamental differences between the two states, rather than conventional heat baths, could serve as a direct energy source, one operating with little to no production of entropy.

Together, these studies broaden the scope of quantum engines beyond traditional heat-based cycles by demonstrating the usefulness of intrinsically quantum energy forms such as chemical work and Pauli energy. Such microscopic ‘machines’ also herald a new class of engines that harness the fundamental laws of quantum physics to convert energy between different forms more efficiently than the best conventional engines can manage with classical physics.

Physics World asked Nautiyal about the potential applications of his work:

… Nautiyal referred to “quantum steampunk”. This term, which was coined by the physicist Nicole Yunger Halpern at the US National Institute of Standards and Technology and the University of Maryland, encapsulates the idea that as quantum technologies advance, the field of quantum thermodynamics must also advance in order to make such technologies more efficient. A similar principle, Nautiyal explains, applies to smartphones: “The processor can be made more powerful, but the benefits cannot be appreciated without an efficient battery to meet the increased power demands.” Conducting research on quantum engines and quantum thermodynamics is thus a way to optimize quantum technologies.

Quantum clock breaks entropy barrier

By: VM
12 July 2025 at 12:21

In physics, the second law of thermodynamics says that a closed system tends to become more disordered over time. This disorder is captured in an entity called entropy. Many devices, especially clocks, are affected by this law because they need to tick regularly to measure time. But every tick creates a bit of disorder, i.e. increases the entropy, and physicists have believed for a long time now that this places a fundamental limit on how precise a clock can be. The more precise you want your clock, the more entropy (and thus more energy) you’ll have to expend.

A study published in Nature Physics on June 2 challenges this wisdom. In it, researchers from Austria, Malta, and Sweden asked if the second law of thermodynamics really sets a limit on a clock’s precision and came away, surprisingly, with a design for a new kind of quantum clock that’s more precise than scientists once believed possible for the amount of energy it spends.

The researchers designed this clock using a spin chain. Imagine a ring made of several quantum sites, like minuscule cups. Each cup can hold an excitation — say, a marble that can hop from cup to cup. This excitation moves around the ring and every time it completes a full circle, the clock ticks once. A spin chain is, broadly speaking, a series of connected quantum systems (the sites) arranged in a ring and the excitation is a subatomic particle or packet of energy that moves from site to site.

In most clocks, every tick is accompanied by the dissipation of some energy and a small increase in entropy. But in the model in the new study, only the last link in the circle, where the last quantum system was linked to the first one, dissipated energy. Everywhere else, the excitation moved without losing energy, like a wave gliding smoothly around the ring. The movement of the excitation in this lossless way through most of the ring is called coherent transport.

The researchers used computer simulations to help them adjust the hopping rates — or how easily the excitation moved between sites — and thus to make the clock as precise as possible. They found that the best setup involved dividing the ring into three regions: (i) in the preparation ramp, the excitation was shaped into a wave packet; (ii) in the bulk propagation phase, the wave packet moved steadily through the ring; and (iii) in the boundary matching phase, the wave packet was reset for the next tick.

The team measured the clock’s precision as the number of ticks it completed before it was one tick ahead or behind a perfect clock. Likewise, team members defined the entropy per tick to be the amount of energy dissipated per tick. Finally, the team compared this quantum clock to classical clocks and other quantum models, which typically show a linear relationship between precision and entropy: e.g. if the precision doubled, the entropy doubled as well.

The researchers, however, found that the precision of their quantum clock grew exponentially with entropy. In other words, if the amount of entropy per tick increased only slightly, the precision increased by a big leap. It was proof that, at least in principle, it’s possible to build a clock to be arbitrarily precise while keeping the system’s entropy down, all without falling afoul of the second law.
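To see how different the two scalings are, here is a minimal numerical illustration. The prefactors are arbitrary, and the formulas stand in for the linear and exponential trends only, not for the paper’s actual expressions.

```python
import math

# Illustration of the two scalings discussed above: in classical clock
# models precision grows roughly linearly with the entropy dissipated
# per tick, while the new quantum clock's precision grows exponentially.
# The prefactors a and b are arbitrary, chosen only for illustration.

def precision_linear(delta_s, a=1.0):
    """Classical-style trade-off: precision ~ a * delta_s."""
    return a * delta_s

def precision_exponential(delta_s, b=1.0):
    """Reported quantum-clock scaling: precision ~ exp(b * delta_s)."""
    return math.exp(b * delta_s)

# Doubling the entropy per tick doubles the linear clock's precision
# but squares the exponential clock's:
for ds in (1.0, 2.0, 4.0):
    print(ds, precision_linear(ds), precision_exponential(ds))
```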

That is, contrary to what many physicists thought, the second law of thermodynamics doesn’t strictly limit a clock’s precision, at least not for quantum clocks like this one. The clock’s design allowed it to sidestep the otherwise usual trade-off between precision and entropy.

During coherent transport, the process is governed only by the system’s Hamiltonian, i.e. the rules for how energy moves in a closed quantum system. In this regime, the excitation acts like a wave that spreads smoothly and reversibly, without losing any energy or creating any disorder. Imagine a ball rolling on a perfectly smooth, frictionless track: it keeps moving without slowing down or heating up the track. Such lossless motion is impossible in classical mechanics, where a real ball always loses some energy to friction, but it is possible in quantum systems. The trade-off, of course, is that quantum systems are very small and very fragile, and thus harder to manipulate.

In the present study, the researchers have proved that it’s possible to build a quantum clock that takes advantage of coherent transport to tick while dissipating very little energy. Their model, the spin chain, uses a Hamiltonian that only allows the excitation to coherently hop to its nearest neighbour. The researchers engineered the couplings between the sites in the preparation ramp part of the ring to shape the excitation into a traveling wave packet that moves predominantly in the forward direction.

This tendency to move in only one direction is further bolstered at the last link, where the last site is coupled to the first. Here, the researchers installed a thermal gradient — a small temperature difference that encouraged the wave to restart its journey rather than be reflected and move backwards through the ring. When the excitation crossed this thermodynamic bias, the clock ticked once and dissipated some energy.

Three points here. First, remember that this is a quantum system. The researchers are dealing with energy (almost) at its barest, manipulating it directly without having to bother with an accoutrement of matter covering it. In the classical regime, such accoutrements are unavoidable. For example, if you have a series of cups and you want to make an excitation hop through it, you do so with a marble. But while the marble contains the (potential) energy that you want to move through the cups, it also has mass and it dissipates energy whenever it hops into a cup, e.g. it might bounce when it lands and it will release sound when it strikes the cup’s material. So while the marble metaphor earlier might have helped you visualise the quantum clock, remember that the metaphor has limitations.

Second, for the quantum clock to work as a clock, it needs to break time-reversal symmetry (a concept I recently discussed in the context of quasicrystals). Say you remove the thermodynamic bias at the last link of the ring and replace it with a regular link. In this case the excitation will move randomly — i.e. at each step it will randomly pick the cup to move to, forward or backward, and keep going. If you reversed time, the excitation’s path will still be random and just evolve in reverse.

However, the final thermodynamically biased link causes the excitation to acquire a preference for moving in one direction. The system thus breaks time-reversal symmetry because even if you reverse the flow of time, the system will encourage the excitation to move in one direction and one direction only. This in turn is essential for the quantum system to function like a clock. That is, the excitation needs to traverse a fixed number of cups in the spin chain and then start from the first cup. Only between these two stages will the system count off a ‘tick’. Breaking time-reversal symmetry thus turns the device into a clock.

Third, the thermodynamic bias ensures that the jump from the last site to the first is more likely than the reverse, and the entropy is the cost the system pays to ensure the jump. Equally, the greater the thermodynamic bias, the more likely the excitation is to move in one direction through the spin chain as well as to make the jump in the right direction at the final step. Thus, the greater the thermodynamic bias, the more precise the clock will be.
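A classical caricature makes the bias-precision link concrete. The sketch below replaces coherent transport with a biased random walk on a ring of ‘cups’, so it is a stand-in for intuition rather than a model of the quantum clock, and it estimates precision as the mean squared tick time over its variance.

```python
import random

# A classical caricature of the ring clock: an excitation hops around
# n_sites cups, stepping forward with probability p_fwd (standing in
# for the thermodynamic bias) and backward otherwise. Each net forward
# loop is one tick. This is a biased random walk, not the coherent
# transport of the actual model; it only illustrates how a stronger
# bias makes the ticks more regular.

def simulate_ticks(n_sites=10, p_fwd=0.9, n_ticks=200, seed=1):
    """Return the number of hops each of n_ticks ticks took."""
    rng = random.Random(seed)
    pos = 0               # unwrapped position along the ring
    target = n_sites      # the next tick fires when pos reaches target
    tick_times, steps = [], 0
    while len(tick_times) < n_ticks:
        steps += 1
        pos += 1 if rng.random() < p_fwd else -1
        if pos == target:             # one net forward loop completed
            target += n_sites
            tick_times.append(steps)
            steps = 0
    return tick_times

def precision(tick_times):
    """Mean squared tick time over its variance: roughly how many
    ticks pass before the clock drifts one tick from a perfect one."""
    n = len(tick_times)
    mean = sum(tick_times) / n
    var = sum((t - mean) ** 2 for t in tick_times) / n
    return mean ** 2 / var if var else float("inf")

weak = precision(simulate_ticks(p_fwd=0.6))
strong = precision(simulate_ticks(p_fwd=0.95))
assert strong > weak  # stronger bias, sharper ticks
```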

The new study excelled by creating a sufficiently precise clock while minimising the entropy cost.

According to the researchers, the clock’s design could help build better quantum clocks, which are important for quantum computers, quantum communication, and ultra-precise measurements of the kind demanded by atomic clocks. The clock’s ticks could also be used to emit single photons at regular intervals — a technology increasingly in demand for its use in quantum networks of the sort China, the US, and India are trying to build.

But more fundamentally, the clock’s design — which confines energy dissipation to a single link and uses coherent transport everywhere else — and that design’s ability to evade the precision-entropy trade-off challenge a longstanding belief that the second law of thermodynamics strictly limits precision.

Featured image credit: Meier, F., Minoguchi, Y., Sundelin, S. et al. Nat. Phys. (2025).
