Is the Higgs boson doing its job?

By: VM
12 September 2025 at 13:19

At the heart of particle physics lies the Standard Model, a theory that has stood for nearly half a century as the best description of the subatomic realm. It tells us what particles exist, how they interact, and why the universe is stable at the smallest scales. The Standard Model has correctly predicted the outcomes of several experiments testing the limits of particle physics. Even so, physicists know that it's incomplete: it can't explain dark matter, why matter dominates over antimatter, and why the force of gravity is so weak compared to the other forces. To settle these mysteries, physicists have been conducting very detailed tests of the Model, each of which has either tightened their confidence in a hypothetical explanation or revealed a new piece of the puzzle.

A central character in this story is a subatomic particle called the W boson — the carrier of the weak nuclear force. Without it, the Sun wouldn't shine, because particle interactions involving the weak force are necessary for nuclear fusion to proceed. W bosons are also unusual among force carriers: unlike photons (the particles of light), they're massive, about 80 times heavier than a proton. This difference — a massless photon versus a massive W boson — arises from a process called the Higgs mechanism. Physicists first proposed this mechanism in 1964 and confirmed it was real when they found the Higgs boson particle at the Large Hadron Collider (LHC) in 2012.

The particles of the Standard Model of particle physics. The W bosons are shown among the force-carrier particles on the right. The photon is denoted γ. The electron (e) and muon (µ) are shown among the leptons. The corresponding neutrino flavours are shown on the bottom row, denoted ν. Credit: Daniel Dominguez/CERN

But finding the Higgs particle was only the beginning. To prove that the Higgs mechanism really works the way the theory says, physicists need to check its predictions in detail. One of the sharpest tests involves how W bosons scatter off each other at high energies. Both photons and W bosons have a property called quantum spin, and this spin comes with an orientation, called the polarisation. If the polarisation points sideways relative to the particle's direction of travel, the W boson is said to be transverse polarised; if it points along the direction of travel, the W boson is said to be longitudinally polarised. Because the photon is massless, it can only ever be transverse polarised; the massive W boson can also be longitudinally polarised. The longitudinal ones are special because their existence and behaviour are directly tied to the Higgs mechanism.

Specifically, if the Higgs mechanism and the Higgs boson don't exist, calculations involving longitudinal W bosons scattering off each other quickly give rise to nonsensical mathematical results. The Higgs boson acts like a regulator, preventing the mathematics from 'blowing up'. In fact, in the 1970s, the theoretical physicists Benjamin Lee, Chris Quigg, and Hugh Thacker showed that without the Higgs boson, the weak force would become uncontrollably powerful at high energies, leading to the breakdown of the theory. Their work was an important theoretical pillar that justified building the colossal LHC machine to search for the Higgs boson particle.
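
To make the 'blowing up' slightly more concrete, here is the schematic, textbook form of that argument (a sketch, not the full calculation): without the Higgs boson, the probability amplitude for two longitudinal W bosons to scatter keeps growing with the collision energy, roughly as

M(W_L W_L → W_L W_L) ∼ s / v²

where s is the squared collision energy and v ≈ 246 GeV is the vacuum value of the Higgs field. At high enough energies the predicted probabilities would exceed 1 — a nonsensical result. Including the Higgs boson's contribution cancels this growing piece, leaving a remainder of order m_H²/v² (m_H being the Higgs boson's mass), which stays safely bounded.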

The terms Higgs boson, Higgs field, and Higgs mechanism describe related but distinct ideas. The Higgs field is a kind of invisible medium thought to fill all of space. Particles like W bosons and Z bosons interact with this field as they move and through that interaction they acquire mass. This is the Higgs mechanism: the process by which particles that would otherwise be massless become heavy.

The Higgs boson is different: it's a particle that represents a vibration or a ripple in the Higgs field, just as a photon is a ripple in the electromagnetic field. Its discovery in 2012 confirmed that the field is real and not just something that appears in the mathematics of the theory. But discovery alone doesn't prove the mechanism is doing everything the theory demands. To test that, physicists need to look at situations where the Higgs boson's balancing role is crucial.

The scattering of longitudinally polarised W bosons is a good example. Without the Higgs boson, the probabilities of these scatterings grow uncontrollably at higher energies, but with the Higgs boson in the picture, they stay within sensible bounds. Observing longitudinally polarised W bosons behaving as predicted is thus evidence for the particle as well as a check on the field and the mechanism behind it.

Imagine a roller-coaster without brakes. As it goes faster and faster, there's nothing to stop it from flying off the tracks. The Higgs mechanism is like the braking system that keeps the ride safe. Observing longitudinally polarised W bosons in the right proportions is equivalent to checking that the brakes actually work when the roller-coaster speeds up.

Credit: Skyler Gerald

Another path that physicists once considered and that didn't involve a Higgs boson at all was called technicolor theory. Instead of a single kind of Higgs boson giving the W bosons their mass, technicolor proposed a brand-new force. Just as the strong nuclear force binds quarks into protons and neutrons, the hypothetical technicolor force would bind new "technifermion" particles into composite states. These bound states would mimic the Higgs boson's job of giving particles mass, while producing their own new signals in high-energy collisions.

The crucial test to check whether some given signals are due to the Higgs boson or due to technicolor lies in the behaviour of longitudinally polarised W bosons. In the Standard Model, their scattering is kept under control by the Higgs boson's balancing act. In technicolor, by contrast, there is no Higgs boson to cancel the runaway growth. The probability of the scattering of longitudinally polarised W bosons would therefore rise sharply with more energy, often leaving clearly excessive signals in the data.

Thus, observing longitudinally polarised W bosons at levels consistent with the predictions of the Standard Model, and not finding any additional signals, would also strengthen the case for the Higgs mechanism and weaken that for technicolor and other "Higgs-less" theories.

At the Large Hadron Collider, the cleanest way to look for such W bosons is in a phenomenon called vector boson scattering (VBS). In VBS, two protons collide and the quarks inside them emit W bosons. These W bosons then scatter off each other before decaying into lighter particles. The leftover quarks form narrow sprays of particles, or 'jets', that fly far forward.

If the two W bosons happen to have the same electric charge — i.e. both positive or both negative — the process is even more distinctive. This same-sign WW scattering is quite rare, but so are the background processes that can mimic it, which makes it easier to spot in the debris of particle collisions.

Both ATLAS and CMS, the two giant detectors at the LHC, had previously observed same-sign WW scattering without breaking the signal down by polarisation. In 2021, the CMS collaboration reported the first hint of longitudinal polarisation, but at a statistical significance of only 2.3 sigma, which isn't good enough (particle physicists prefer at least 3 sigma). So after the LHC completed its second run in 2018, collecting data from around 10 quadrillion collisions between protons, the ATLAS collaboration set out to analyse it and deliver the evidence. This group's study was published in Physical Review Letters on September 10.

The layout of the Large Hadron Collider complex at CERN. Protons (p) are pre-accelerated to higher energies in steps — at the Proton Synchrotron (PS) and then the Super Proton Synchrotron (SPS) — before being injected into the LHC ring. The machine then draws two opposing beams of protons from the SPS and accelerates them to nearly the speed of light before colliding them head-on at four locations, under the gaze of four detectors. ATLAS and CMS are two of them. Credit: Arpad Horvath (CC BY-SA)

The challenge of finding longitudinally polarised W bosons is like finding a particular needle in a very large haystack where most of the needles look nearly identical. So ATLAS designed a special strategy.

When one W boson decays, the result is one electron or muon and one neutrino. If the W boson is positively charged, for example, the decay could be to one anti-electron and one electron-neutrino or to one anti-muon and one muon-neutrino. Anti-electrons and anti-muons are positively charged. If the W boson is negatively charged, the products could be one electron and one electron-antineutrino or one muon and one muon-antineutrino. So first, ATLAS zeroed in on events containing two electrons, two muons, or one of each, both carrying the same electric charge. Neutrinos, however, are really hard to catch and study, so the ATLAS group looked for their absence rather than their presence. In all these particle interactions the law of conservation of momentum holds — which means a neutrino's presence can be inferred when the momenta of the particles the detector does see don't add up: the visible particles carry slightly less momentum than they should. The missing amount would have been carried away by the neutrino, like money unaccounted for in a ledger.
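
As a toy illustration of this bookkeeping — with made-up numbers, not ATLAS data — the transverse momenta of everything visible in an event should sum to zero, so whatever is 'missing' is attributed to the neutrinos:

```python
import numpy as np

# Toy event: transverse momentum vectors (px, py) in GeV of the visible
# particles -- two same-sign leptons and two forward jets. The numbers
# are invented purely for illustration.
visible = np.array([
    [45.0,   12.0],   # muon
    [38.0,  -20.0],   # second muon (same charge)
    [95.0,   60.0],   # jet 1
    [-150.0, -30.0],  # jet 2
])

# Momentum conservation in the plane transverse to the beams: the visible
# momenta should sum to zero, so the shortfall is ascribed to the neutrinos.
missing = -visible.sum(axis=0)
met = np.hypot(*missing)

print(f"missing transverse momentum vector: {missing} GeV")
print(f"missing transverse momentum (magnitude): {met:.1f} GeV")
```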

This analysis also required an event of interest to have at least two jets (reconstructed from streams of particles) with a combined invariant mass above 500 GeV and separated widely in rapidity (which is a measure of their angle relative to the beam). This particular VBS pattern — two electrons/muons, two jets, and missing momentum — is the hallmark of same-sign WW scattering.

Second, even with these strict requirements, impostors can creep in. The biggest source of confusion was WZ production, a process in which another subatomic particle called the Z boson decays invisibly or one of its decay products goes unnoticed, making the event resemble WW scattering. Other sources include electrons having their charges mismeasured, jets masquerading as electrons/muons, and some quarks producing electrons/muons that slip into the sample. To control for all this noise, the ATLAS group used control regions: subsets of events dominated by one particular kind of noise, which the group could measure there and then cleanly 'subtract' from the data to reveal same-sign WW scattering, thus also reducing uncertainty in the final results.

Third, and this is where things get nuanced: the differences between transverse and longitudinally polarised W bosons show up in distributions — i.e. how far apart the electrons/muons are in angle, how the jets are oriented, and the energy of the system. But since no single variable could tell the whole story, the ATLAS group combined them using deep neural networks. These machine-learning models were fed up to 20 kinematic variables — including jet separations, particle angles, and missing momentum patterns — and trained to distinguish between three groups (a rough illustrative sketch follows the list):

(i) Two transverse polarised W bosons;

(ii) One transverse polarised W boson and one longitudinally polarised W boson; and

(iii) Both longitudinally polarised W bosons
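
Here is a minimal sketch of this kind of multi-class discriminant, using synthetic data and scikit-learn rather than ATLAS's actual networks, software, or kinematic variables — the three classes below are just Gaussian blobs standing in for simulated TT, TL, and LL events:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-ins for ~20 kinematic variables (jet separations,
# lepton angles, missing momentum, ...). The class means are shifted
# slightly so that the classes are distinguishable but overlapping.
n_per_class, n_features = 5000, 20
means = [0.0, 0.3, 0.6]  # TT, TL, LL -- purely illustrative offsets
X = np.vstack([rng.normal(m, 1.0, size=(n_per_class, n_features)) for m in means])
y = np.repeat([0, 1, 2], n_per_class)  # 0: TT, 1: TL, 2: LL

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# A small feed-forward network that outputs, for each event, the
# probability of belonging to each polarisation class.
clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=300, random_state=0)
clf.fit(X_train, y_train)

probs = clf.predict_proba(X_test[:3])
print("per-event class probabilities (TT, TL, LL):")
print(np.round(probs, 3))
print("test accuracy:", round(clf.score(X_test, y_test), 3))
```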

Fourth, the group combined the outputs of these neural networks and fit them with a maximum likelihood method. When physicists make measurements, they often don't directly see what they're measuring. Instead, they see data points that could have come from different possible scenarios. A likelihood is a number that tells them how probable the data is in a given scenario. If a model says "events should look like this," they can ask: "Given my actual data, how likely is that?" And the maximum likelihood method will help them decide the parameters that make the given data most likely to occur.

For example, say you toss a coin 100 times and get 62 heads. You wonder: is the coin fair or biased? If it's fair, the chance of exactly 62 heads is small. If the coin is slightly biased (heads with probability 0.62), the chance of 62 heads is higher. The maximum likelihood estimate is to pick the bias, or probability of heads, that makes your actual result most probable. So here the method would say, "The coin's bias is 0.62" — because this choice maximises the likelihood of seeing 62 heads out of 100.
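
The coin example can be written out directly. A small sketch (nothing here is specific to the ATLAS analysis; the constant binomial coefficient is dropped because it doesn't depend on the bias):

```python
import numpy as np

heads, tosses = 62, 100

# Scan possible values of the coin's bias p and compute the log-likelihood
# of observing exactly 62 heads in 100 tosses under each value.
p = np.linspace(0.01, 0.99, 981)
log_likelihood = heads * np.log(p) + (tosses - heads) * np.log(1 - p)

# The maximum likelihood estimate is the p that makes the data most probable.
p_hat = p[np.argmax(log_likelihood)]
print(f"maximum likelihood estimate of the bias: {p_hat:.2f}")  # ~0.62
```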

In their analysis, the ATLAS group used the maximum likelihood method to check whether the LHC data 'preferred' a contribution from longitudinal scattering, after subtracting what background noise and transverse-only scattering could explain.

The results may be a milestone in experimental particle physics. In the September 10 paper, ATLAS reported evidence for longitudinally polarised W bosons in same-sign WW scattering with a significance of 3.3 sigma — close to the 4 sigma expected based on the predictions of the Standard Model. This means the data behaved as theory predicted, with no unexpected excess or deficit.

It's also bad news for technicolor theory. By observing longitudinal W bosons at exactly the rates predicted by the Standard Model, and not finding any additional signals, the ATLAS data strengthens the case for the Higgs mechanism providing the check on the W bosons' scattering probability, rather than the technicolor force.

The measured cross-section for events with at least one longitudinally polarised W boson was 0.88 femtobarns, with an uncertainty of 0.3 femtobarns. These figures essentially mean that there were only a few hundred same-sign WW scattering events in the full dataset of around 10 quadrillion proton-proton collisions. The fact that ATLAS could pull this signal out of such a background-heavy environment is a testament to the power of modern machine learning working with advanced statistical methods.
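
A back-of-the-envelope conversion shows why so few events are involved: the expected number of events is simply the cross-section multiplied by the integrated luminosity. The luminosity figure below is an assumed, approximate size for the Run 2 dataset, not a number taken from the paper:

```python
# Expected events = cross-section x integrated luminosity.
cross_section_fb = 0.88    # measured: events with >= 1 longitudinally polarised W
uncertainty_fb = 0.3
luminosity_inv_fb = 140.0  # assumed rough size of the LHC Run 2 dataset

n_expected = cross_section_fb * luminosity_inv_fb
n_uncertainty = uncertainty_fb * luminosity_inv_fb
print(f"~{n_expected:.0f} ± {n_uncertainty:.0f} events with at least one "
      "longitudinally polarised W boson in the whole Run 2 dataset")
```

Roughly a hundred such events, out of the few hundred same-sign WW scattering events of all polarisations, buried in quadrillions of collisions.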

The group was also able to quantify the composition of the selected events. Among other contributions:

  1. About 58% of events were genuine WW scattering
  2. Roughly 16% were from WZ production
  3. Around 18% arose from irrelevant electrons/muons, charge misidentification or the decay of energetic photons

One way to appreciate the importance of these findings is by analogy: imagine trying to hear a faint melody being played by a single violin in the middle of a roaring orchestra. The violin is the longitudinal signal; the orchestra is the flood of background noise. The neural networks are like sophisticated microphones and filters, tuned to pick out the violin's specific tone. The fact that ATLAS could not only hear it but also measure its volume to match the score written by the Standard Model is remarkable.

These results are more than just another tick mark for the Standard Model. They're a direct test of the Higgs mechanism in action. The discovery of the Higgs boson particle in 2012 was groundbreaking but proving that the Higgs mechanism performs its theoretical role requires demonstrating that it regulates the scattering of W bosons. By finding evidence for longitudinally polarised W bosons at the expected rate, ATLAS has done just that.

The results also set the stage for the future. The LHC is currently being upgraded to a form called the High-Luminosity LHC and it will begin operating later this decade, collecting datasets about 10x larger than what the LHC did in its second run. With that much more data, physicists will be able to study differential distributions, i.e. how the rate of longitudinal scattering varies with energy, angle or jet separation. These patterns are sensitive to hitherto unknown particles and forces, such as additional Higgs-like particles or modifications to the Higgs mechanism itself. And even small deviations from the Standard Model's predictions could hint at new frontiers in particle physics.

Indeed, history has reminded physicists that such precision studies often uncover surprises. For example, physicists didn't discover neutrino oscillations by finding a new particle but by noticing that the number of neutrinos arriving from the Sun at detectors on Earth didn't match expectations. Similarly, minuscule mismatches between theory and observations in the scattering of W bosons could someday reveal new physics — and if they do, the seeds will have been planted by studies like that of the ATLAS group.

On the methodological front, the analysis also showcases how particle physics is evolving. 'Classical' analyses once banked on tracking single variables; now, deep learning plays a starring role, combining many variables into a single discriminant and allowing ATLAS to pull the faint signal of longitudinally polarised W bosons from the noise. This approach will only become more important as both datasets and physicists' ambitions expand.

Perhaps the broadest lesson in all this is that science often advances by the unglamorous task of verifying the details. The discovery of the Higgs boson answered one question but opened many others; among them, measuring how it affects the scattering of W bosons is one of the more direct ways to probe whether the Standard Model is complete or just the first chapter of a longer story. Either way, the pursuit exemplifies the spirit of checking, rechecking, testing, and probing until scientists truly understand how nature works at extreme precision.

A transistor for heat

By: VM
25 August 2025 at 11:49

Quantum technologies and the prospect of advanced, next-generation electronic devices have been maturing at an increasingly rapid pace. Research groups and governments around the world alike are devoting more attention to this domain.

India for example mooted its National Quantum Mission in 2023 with a decade-long outlay of Rs 6,000 crore. One of the Mission’s goals, in the words of IISER Pune physics professor Umakant Rapol, is “to engineer and utilise the delicate quantum features of photons and subatomic particles to build advanced sensors” for applications in “healthcare, security, and environmental monitoring”.

On the science front, as these technologies become better understood, scientists have been paying increasing attention to managing and controlling heat in them. These technologies often rely on quantum physical phenomena that appear only at extremely low temperatures and are so fragile that even a small amount of stray heat can destabilise them. In these settings, scientists have found that traditional methods of handling heat — mainly by controlling the vibrations of atoms in the devices’ materials — become ineffective.

Instead, scientists have identified a promising alternative: energy transfer through photons, the particles of light. And in this paradigm, instead of simply moving heat from one place to another, scientists have been trying to control and amplify it, much like how transistors and amplifiers handle electrical signals in everyday electronics.

Playing with fire

Central to this effort is the concept of a thermal transistor. This device resembles an electrical transistor but works with heat instead of electrical current. Electrical transistors amplify or switch currents, allowing the complex logic and computation required to power modern computers. Creating similar thermal devices would represent a major advance, especially for technologies that require very precise temperature control. This is particularly true in the sub-kelvin temperature range where many quantum processors and sensors operate.

This circuit diagram depicts an NPN bipolar transistor. When a small voltage is applied between the base and emitter, electrons are injected from the emitter into the base, most of which then sweep across into the collector. The end result is a large current flowing through the collector, controlled by the much smaller current flowing through the base. Credit: Michael9422 (CC BY-SA)

Energy transport at such cryogenic temperatures differs significantly from normal conditions. Below roughly 1 kelvin, atomic vibrations no longer carry most of the heat. Instead, electromagnetic fluctuations — ripples of energy carried by photons — dominate the conduction of heat. Scientists channel these photons through specially designed, lossless wires made of superconducting materials. They keep these wires below their superconducting critical temperatures, allowing only photons to transfer energy between the reservoirs. This arrangement enables careful and precise control of heat flow.

One crucial phenomenon that allows scientists to manipulate heat in this way is negative differential thermal conductance (NDTC). NDTC defies common intuition. Normally, decreasing the temperature difference between two bodies reduces the amount of heat they exchange. This is why a glass of water at 50° C in a room at 25° C will cool faster than a glass of water at 30° C. In NDTC, however, reducing the temperature difference between two connected reservoirs can actually increase the heat flow between them.

NDTC arises from a detailed relationship between temperature and the properties of the material that makes up the reservoirs. When physicists harness NDTC, they can amplify heat signals in a manner similar to how negative electrical resistance powers electrical amplifiers.

A ‘circuit’ for heat

In a new study, researchers from Italy have designed and theoretically modelled a new kind of ‘thermal transistor’ that they have said can actively control and amplify how heat flows at extremely low temperatures for quantum technology applications. Their findings were published recently in the journal Physical Review Applied.

To explore NDTC experimentally, the researchers studied reservoirs made of a disordered semiconductor material that exhibited a transport mechanism called variable range hopping (VRH). An example is neutron-transmutation-doped germanium. In VRH materials, the electrical resistance at low temperatures depends very strongly, sometimes exponentially, on temperature.

This attribute makes it possible to tune their impedance — a property that controls the material’s resistance to energy flow — simply by adjusting their temperature. That is, how well two reservoirs made of VRH materials exchange heat can be controlled by tuning the impedance of the materials, which in turn can be controlled by tuning their temperature.

In the new study, the researchers reported that impedance matching played a key role. When the reservoirs’ impedances matched perfectly (when their temperatures became equal), the efficiency with which they transferred photonic heat reached a peak. As the materials’ temperatures diverged, heat flow dropped. In fact, the researchers wrote that there was a temperature range, especially as the colder reservoir’s temperature rose to approach that of the warmer one, within which the heat flow increased even as the temperature difference shrank. This effect forms the core of NDTC.
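
A toy numerical sketch can illustrate how impedance matching gives rise to NDTC. It is not the paper's model: it only combines two textbook ingredients — the photon-mediated heat flow between two resistive reservoirs, scaled by an impedance-matching factor, and a generic variable-range-hopping resistance law — with made-up parameters chosen purely for illustration:

```python
import numpy as np

# Textbook ingredients (not the paper's full model):
#   - matching factor r = 4*R1*R2 / (R1 + R2)^2, which peaks at 1 when R1 == R2
#   - net photonic power P = r * (pi * kB^2 / (12 * hbar)) * (T_hot^2 - T_cold^2)
#   - a generic VRH resistance law R(T) = R0 * exp(sqrt(T0 / T))
kB, hbar = 1.380649e-23, 1.054571817e-34

def vrh_resistance(T, R0=1.0, T0=50.0):
    """Illustrative VRH resistance; R0 and T0 are made-up parameters."""
    return R0 * np.exp(np.sqrt(T0 / T))

def photonic_power(T_hot, T_cold):
    R1, R2 = vrh_resistance(T_hot), vrh_resistance(T_cold)
    matching = 4 * R1 * R2 / (R1 + R2) ** 2
    return matching * (np.pi * kB**2 / (12 * hbar)) * (T_hot**2 - T_cold**2)

T_hot = 0.200  # kelvin: the warmer reservoir, held fixed
for T_cold in [0.100, 0.140, 0.170, 0.185, 0.195]:
    P = photonic_power(T_hot, T_cold)
    print(f"T_cold = {T_cold*1e3:5.0f} mK  ->  P = {P*1e15:5.2f} femtowatts")

# In this toy the power keeps rising as T_cold climbs from 100 mK to about
# 170 mK -- even though the temperature difference is shrinking -- because
# the improving impedance match wins out. That stretch is the NDTC regime;
# still closer to T_hot, the flow finally drops towards zero.
```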

The research team, associated with the NEST initiative at the Istituto Nanoscienze-CNR and Scuola Normale Superiore, both in Pisa in Italy, have proposed a device they call the photonic heat amplifier. They built it using two VRH reservoirs connected by superconducting, lossless wires. One reservoir was kept at a higher temperature and served as the source of heat energy. The other reservoir, called the central island, received heat by exchanging photons with the warmer reservoir.

The proposed device features a central island at temperature T1 that transfers heat currents to various terminals. The tunnel contacts to the drain and gate are positioned at heavily doped regions of the yellow central island, highlighted by a grey etched pattern. Each arrow indicates the positive direction of the heat flux. The substrate is maintained at temperature Tb, the gate at Tg, and the drain at Td. Credit: arXiv:2502.04250v3

The central island was also connected to two additional metallic reservoirs named the “gate” and the “drain”. These points operated with the same purpose as the control and output terminals in an electrical transistor. The drain stayed cold, allowing the amplified heat signal to exit the system from this point. By adjusting the gate temperature, the team could modulate and even amplify the flow of heat between the source and the drain (see the image above).

To understand and predict the amplifier’s behaviour, the researchers developed mathematical models for all forms of heat transfer within the device. These included photonic currents between VRH reservoirs, electron tunnelling through the gate and drain contacts, and energy lost as vibrations through the device's substrate.

(Tunnelling is a quantum mechanical phenomenon where an electron has a small chance of floating through a thin barrier instead of going around it.)

Raring to go

By carefully selecting the device parameters — including the characteristic temperature of the VRH material, the source temperature, resistances at the gate and drain contacts, the volume of the central island, and geometric factors — the researchers said they could tailor the device for different amplification purposes.

They reported two main operating modes. The first was called ‘current modulation amplifier’. In this configuration, the device amplified small variations in thermal input at the gate. In this mode, small oscillations in the gate heat current produced much larger oscillations, up to 15 times greater, in the photon current between the source and the central island and in the drain current, according to the paper. This amplification was efficient down to 20 millikelvin, matching the ultracold conditions required in quantum technologies. The output range of heat current was similarly broad, showing the device’s suitability to amplify heat signals.

The second mode was called ‘temperature modulation amplifier’. Here, slight changes of only a few millikelvin in the gate temperature, the team wrote, caused the output temperature of the central island to swing by as much as 3.3 times the change in the input. The device could also handle input temperature ranges over 100 millikelvin. This performance reportedly matched or surpassed other temperature amplifiers already reported in the scientific literature. The researchers also noted that this mode could be used to pre-amplify signals in bolometric detectors used in astronomy telescopes.

An important property relevant for practical use is the relaxation time, i.e. how soon after one operation the device returns to its original state, ready for the next run. The amplifier in both configurations showed relaxation times between microseconds and milliseconds. According to the researchers, this speed resulted from the device’s low thermal mass and efficient heat channels. Such a fast response could make it suitable to detect and amplify thermal signals in real time.

The researchers wrote that the amplifier also maintained good linearity and low distortion across various inputs. In other words, the output heat signal changed proportionally to the input heat signal and the device didn’t add unwanted changes, noise or artifacts to the input signal. Its noise-equivalent power values were also found to rival the best available solid-state thermometers, indicating low noise levels.

Approaching the limits

For all these promising results, realising this device involves some significant practical challenges. For instance, NDTC depends heavily on precise impedance matching. Real materials inevitably have imperfections, including those due to imperfect fabrication and environmental fluctuations. Such deviations could lower the device’s heat transfer efficiency and reduce the operational range of NDTC.

The system also banked on lossless superconducting wires being kept well below their critical temperatures. Achieving and maintaining these ultralow temperatures requires sophisticated and expensive refrigeration infrastructure, which adds to the experimental complexity.

Fabrication also demands very precise doping and finely tuned resistances for the gate and drain terminals. Scaling production to create many devices or arrays poses major technical difficulties. Integrating numerous photonic heat amplifiers into larger thermal circuits risks unwanted thermal crosstalk and signal degradation, a risk compounded by the extremely small heat currents involved.

Furthermore, the fully photonic design offers benefits such as electrical isolation and long-distance thermal connections. However, it also approaches fundamental physical limits. Thermal conductance caps the maximum possible heat flow through photonic channels. This limitation could restrict how much power the device is able to handle in some applications.

Then again, many of these challenges are typical of cutting-edge research in quantum devices, and highlight the need for detailed experimental work to realise and integrate photonic heat amplifiers into operational quantum systems.

If they are successfully realised for practical applications, photonic heat amplifiers could transform how scientists manage heat in quantum computing and nanotechnologies that operate near absolute zero. They could pave the way for on-chip heat control, for computers that autonomously stabilise their temperature, and for thermal logic operations. Redirecting or harvesting waste heat could also improve efficiency and significantly reduce noise — a critical barrier in ultra-sensitive quantum devices like quantum computers.

Featured image credit: Lucas K./Unsplash.
