What does it mean to interpret quantum physics?

By: VM
3 September 2025 at 08:01

The United Nations has designated 2025 the International Year of Quantum Science and Technology. Many physics magazines and journals have taken the opportunity to publish more articles on quantum physics than they usually do, and that has meant quantum physics research has often been on my mind. Nirmalya Kajuri, an occasional collaborator, an assistant professor at IIT Mandi, and an excellent science communicator, recently asked other physics teachers on X.com how much time they spend teaching the interpretations of quantum physics. His question and the articles I’ve been reading inspired me to write the following post. I hope it’s useful in particular to people like me, who are interested in physics but didn’t formally train to study it.


Quantum physics is often described as the most successful theory in science. It explains how atoms bond, how light interacts with matter, how semiconductors and lasers work, and even how the sun produces energy. With its equations, scientists can predict experimental results with astonishing precision — up to 10 decimal places in the case of the electron’s magnetic moment.

In spite of this extraordinary success, quantum physics is unusual compared to other scientific theories because it doesn’t tell us a single, clear story about what reality is like. The mathematics yields predictions that have never been contradicted within their tested domain, yet it leaves open the question of what the world is actually doing behind those numbers. This is what physicists mean when they speak of the ‘interpretations’ of quantum mechanics.

In classical physics, the situation is more straightforward. Newton’s laws describe how forces act on bodies, leading them to move along definite paths. Maxwell’s theory of electromagnetism describes electric and magnetic fields filling space and interacting with charges. Einstein’s relativity shows space and time are flexible and curve under the influence of matter and energy. These theories predict outcomes and provide a coherent picture of the world: objects have locations, fields have values, and spacetime has shape. In quantum mechanics, the mathematics works perfectly — but the corresponding picture of reality is still unclear.

The central concept in quantum theory is the wavefunction. This is a mathematical object that contains all the information about a system, such as an electron moving through space. The wavefunction evolves smoothly in time according to the Schrödinger equation. If you know the wavefunction at one moment, you can calculate it at any later moment using the equation. But when a measurement is made, the rules of the theory change. Instead of continuing smoothly, the wavefunction is used to calculate probabilities for different possible outcomes, and then one of those outcomes occurs.
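For readers who want to see these two rules in their standard textbook form (my addition; the post itself keeps the mathematics out of view), the smooth evolution is governed by the Schrödinger equation and the measurement probabilities by the Born rule:

```latex
% Schrödinger equation: smooth, deterministic evolution of the wavefunction
i\hbar\,\frac{\partial}{\partial t}\,\psi(x, t) = \hat{H}\,\psi(x, t)

% Born rule: probability of finding the particle at x upon measurement
P(x) = |\psi(x, t)|^{2}
```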

For instance, if an electron has a 50% chance of being detected on the left and a 50% chance of being detected on the right, the experiment will yield either left or right, never both at once. The mathematics says that before the measurement, the electron exists in a superposition of left and right, but after the measurement only one is found. This peculiar structure, where the wavefunction evolves deterministically between measurements but then seems to collapse into a definite outcome when observed, has no counterpart in classical physics.
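A minimal numerical sketch of that 50-50 example (my own illustration, not anything from the post): each simulated measurement returns exactly one outcome, ‘left’ or ‘right’, and the equal weighting of the superposition shows up only in the long-run frequencies.

```python
import numpy as np

# Illustrative equal superposition of |left> and |right>,
# written as a two-component vector of amplitudes.
psi = np.array([1.0, 1.0]) / np.sqrt(2)

# Born rule: the probability of each outcome is the squared magnitude of its amplitude.
probs = np.abs(psi) ** 2            # -> [0.5, 0.5]

rng = np.random.default_rng(seed=42)
outcomes = rng.choice(["left", "right"], size=10_000, p=probs)

# Each simulated measurement yields a single definite outcome, never both at once;
# the 50-50 character of the superposition appears only in the aggregate statistics.
values, counts = np.unique(outcomes, return_counts=True)
print(dict(zip(values, counts)))    # roughly {'left': 5000, 'right': 5000}
```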

The puzzles arise because it’s not clear what the wavefunction really represents. Is it a real physical wave that somehow ‘collapses’? Is it merely a tool for calculating probabilities, with no independent existence? Is it information in the mind of an observer rather than a feature of the external world? The mathematics doesn’t say.

The measurement problem asks why the wavefunction collapses at all and what exactly counts as a measurement. Superposition raises the question of whether a system can truly be in several states at once or whether the mathematics is only a convenient shorthand. Entanglement, where two particles remain linked in ways that seem to defy distance, forces us to wonder whether reality itself is nonlocal in some deep sense. Each of these problems points to the fact that while the predictive rules of quantum theory are clear, their meaning is not.

Over the past century, physicists and philosophers have proposed many interpretations of quantum mechanics. The most traditional is often called the Copenhagen interpretation, popularly associated with the Schrödinger’s cat thought experiment. In this view, the wavefunction is not a real physical object but a computational tool: in many Copenhagen-style readings it’s a device for organising expectations, while measurement is taken as a primitive, irreducible step. The many-worlds interpretation offers a different view that denies the wavefunction ever collapses. Instead, all possible outcomes occur, each in its own branch of reality. When you measure the electron, there is one version of you that sees it on the left and another version that sees it on the right.

In Bohmian mechanics, particles always have definite positions guided by a pilot wave that’s represented by the wavefunction. In this view, the randomness of measurement outcomes arises because we can’t know the precise initial positions of the particles. There are also objective collapse theories that take the wavefunction as real but argue that it undergoes genuine, physical collapse triggered randomly or by specific conditions. Finally, an informational approach called QBism says the wavefunction isn’t about the world at all but about an observer’s expectations for experiences upon acting on the world.

Most interpretations reproduce the same experimental predictions (objective-collapse models are the exception: they predict small, testable deviations) but tell different stories about what the world is really like.

It’s natural to ask why interpretations are needed at all if they don’t change the predictions. Indeed, many physicists work happily without worrying about them. To build a transistor, calculate the energy of a molecule, or design a quantum computer, the rules of standard quantum mechanics suffice. Yet interpretations matter for several reasons, not least because they shape our philosophical understanding of what kind of universe we live in.

They also influence scientific creativity because some interpretations suggest directions for new experiments. For example, objective collapse theories predict small deviations from the usual quantum rules that can, at least in principle, be tested. Interpretations also matter in education. Students taught only the Copenhagen interpretation may come away thinking quantum physics is inherently mysterious and that reality only crystallises when it’s observed. Students introduced to many-worlds alone may instead think of the universe as an endlessly branching tree. The choice of interpretation moulds the intuition of future physicists. At the frontiers of physics, in efforts to unify quantum theory with gravity or to describe the universe as a whole, questions about what the wavefunction really is become unavoidable.

In research fields that apply quantum mechanics to practical problems, many physicists don’t think about interpretation at all. A condensed-matter physicist studying superconductors uses the standard formalism without worrying about whether electrons are splitting into multiple worlds. But at the edges of theory, interpretation plays a major role. In quantum cosmology, where there are no external observers to perform measurements, one needs to decide what the wavefunction of the universe means. How we interpret entanglement, whether as a real physical relation or as a representational device, colours how technologists imagine the future of quantum computing. In quantum gravity, the question of whether spacetime itself can exist in superposition renders interpretation crucial.

Interpretations also matter in teaching. Instructors make choices, sometimes unconsciously, about how to present the theory. One professor may stick to the Copenhagen view and tell students that measurement collapses the wavefunction and that that’s the end of the story. Another may prefer many-worlds and suggest that collapse never occurs, only branching universes. A third may highlight information-based views, stressing that quantum mechanics is really about knowledge and prediction rather than about what exists independently. These different approaches shape the way students understand quantum mechanics, both as a tool and as a worldview. For some, quantum physics will always appear mysterious and paradoxical. For others, it will seem strange but logical once its hidden assumptions are made clear.

Interpretations also play a role in experiment design. Objective collapse theories, for example, predict that superpositions of large objects should spontaneously collapse. Experimental physicists are now testing whether quantum superpositions survive for increasingly massive molecules or for diminutive mechanical devices, precisely to check whether collapse really happens. Interpretations have also motivated tests of Bell’s inequalities, which show that no local theory with “hidden variables” can reproduce all the correlations predicted by quantum mechanics. The scientists who conducted these experiments confirmed entanglement is a genuine feature of the world, not a residue of the mathematical tools we use to study it — and won the Nobel Prize for physics in 2022. Today, entanglement is exploited in technologies such as quantum cryptography. Without the interpretative debates that forced physicists to take these puzzles seriously, such developments might never have been pursued.
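To give a rough sense of what those tests check, here is a small sketch (my own, using the standard CHSH form of Bell’s inequality rather than anything from the post): any local hidden-variable model keeps the CHSH quantity at or below 2, whereas quantum mechanics predicts 2√2 for entangled particles measured at suitably chosen angles.

```python
import numpy as np

# Quantum prediction for the correlation between spin measurements made at
# angles a and b on the two halves of a singlet (maximally entangled) pair.
def E(a, b):
    return -np.cos(a - b)

# Standard CHSH measurement angles (in radians).
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

# CHSH combination: local hidden-variable theories are bounded by |S| <= 2.
S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)

print(abs(S))   # ~2.828 = 2*sqrt(2), exceeding the classical bound of 2
```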

The fact that some physicists care deeply about interpretation while others don’t reflects different goals. Those who work on applied problems or who need to build devices don’t have to care much. The maths provides the answers they need. Those who are concerned with the foundations of physics, with the philosophy of science or with the unification of physical theories care very much, because interpretation guides their thinking about what’s possible and what’s not. Many physicists switch back and forth, ignoring interpretation when calculating in the lab but discussing many-worlds or informational views over chai.

Quantum mechanics is unique among physical theories in this way. Few chemists or engineers spend time worrying about the ‘interpretation’ of Newtonian mechanics or thermodynamics because these theories present straightforward pictures of the world. Quantum mechanics instead gives flawless predictions but an under-determined picture. The search for interpretation is the search for a coherent story that links the extraordinary success of the mathematics to a clear vision of what the world is like.

To interpret quantum physics is therefore to move beyond the bare equations and ask what they mean. Unlike classical theories, quantum mechanics doesn’t supply a single picture of reality along with its predictions. It leaves us with probabilities, superpositions, and entanglement, and it remains ambiguous about what these things really are. Some physicists insist interpretation is unnecessary; to others it’s essential. Some interpretations depict reality as a branching multiverse, others as a set of hidden particles, yet others as information alone. None has won final acceptance, but all try to close the gap between predictive success and conceptual clarity.

In daily practice, many physicists calculate without worrying, but in teaching, in probing the limits of the theory, and in searching for new physics, interpretations matter. They shape not only what we understand about the quantum world but also how we imagine the universe we live in.

Who funds quantum research?

By: VM
11 March 2025 at 05:32

An odd little detail in a Physics World piece on Microsoft’s claim to have made a working topological qubit:

Regardless of the debate about the results and how they have been announced, researchers are supportive of the efforts at Microsoft to produce a topological quantum computer. “As a scientist who likes to see things tried, I’m grateful that at least one player stuck with the topological approach even when it ended up being a long, painful slog,” says [Scott] Aaronson.

“Most governments won’t fund such work, because it’s way too risky and expensive,” adds [Winfried] Hensinger. “So it’s very nice to see that Microsoft is stepping in there.”

In drug development, defence technologies, and life sciences research, to name a few, we’ve seen the opposite: governments fund the risky, expensive part for many years, often decades, until something viable emerges. Then the IP moves to public and private sector enterprises for commercialisation, sometimes together with government subsidies to increase public access. With pharmaceuticals in particular, the government often doesn’t recoup the investments it has made in the discovery phase, which includes medical education and research. An illustrative recent example is the development of mRNA vaccines; from my piece in The Hindu criticising the medicine Nobel Prize for this work:

Dr. Karikó and Dr. Weissman began working together on the mRNA platform at the University of Pennsylvania in the late 1990s. The University licensed its patents to mRNA RiboTherapeutics, which sublicensed them to CellScript, which sublicensed them to Moderna and BioNTech for $75 million each. Dr. Karikó joined BioNTech as senior vice-president in 2013, and the company enlisted Pfizer to develop its mRNA vaccine for COVID-19 in 2020.

Much of the knowledge that underpins most new drugs and vaccines is unearthed at the expense of governments and public funds. This part of drug development is more risky and protracted: scientists identify potential biomolecular targets within the body on which a drug could act in order to manage a particular disease, followed by identifying suitable chemical candidates. The cost and time estimates of this phase are $1 billion-$2.5 billion and several decades, respectively.

Companies subsequently commoditise and commercialise these entities, raking in millions in profits, typically at the expense of the same people whose taxes funded the fundamental research. There is something to be said for this model of drug and vaccine development, particularly for the innovation it fosters and the eventual competition that lowers prices, but we cannot deny the ‘double-spend’ it imposes on consumers — including governments — and the profit-seeking attitude it engenders among the companies developing and manufacturing the product.

Quantum computing may well define the next technological revolution together with more mature AI models. Topological quantum computing in particular — if realised well enough to compete with alternative architectures based on superconducting wires and/or trapped ions — could prove especially valuable for its ability to be more powerful with fewer resources. Governments justify their continuing sizeable expenditure on drug development by the benefits that eventually accrue to the country’s people. By all indications, quantum technologies will have similar consequences, following a comparable trajectory of development where certain lines of inquiry are not precluded because they could be loss-making or amount to false starts. And they will impinge on everything from one’s fundamental rights to national security.

But Hensinger’s opinion indicates the responsibility of developing this technology has been left to the private sector. I wonder if there are confounding factors here. For example, is Microsoft’s pursuit of a topological qubit the exception to the rule, i.e. one of a few privately funded efforts in a sea of publicly funded research? Another possibility is that we’re hearing about Microsoft’s success because it has a loud voice, and that its announcement may in fact have been premature (context here). It’s also possible Microsoft’s effort included grants from NSF, DARPA or the like.

All this said, let’s assume for a moment that what Hensinger said was true of quantum computing research in general: the lack of state-led development in such potentially transformative technologies raises two (closely related) concerns. The first has to do with scientific progress, specifically the prospect that it will happen behind closed doors. In a June 2023 note, senior editors of the Physical Review B journal acknowledged the tension between the importance of researchers sharing their data for scrutiny, replication, and for others to build on their work — all crucial for science — and private sector enterprises’ need to protect IP and thus withhold data. “This will not be the last time the American Physical Society confronts a tension between transparency and the transmission of new results,” they added. Unlike in drug development, life sciences, etc., even the moral argument that publicly funded research must be in the public domain is rendered impotent, although it can still be recast as the weaker “research that affects the public sphere…”.

The second is democracy. In a March 2024 commentary, digital governance experts Nathan Sanders, Bruce Schneier, and Norman Eisen wrote that the state could develop a “public AI” to counter the already apparent effects of “private AI” on democratic institutions. According to them, a “public AI” model could “provide a mechanism for public input and oversight on the critical ethical questions facing AI development,” including “how to incorporate copyrighted works in model training” and “how to license access for sensitive applications ranging from policing to medical use”. They added: “Federally funded foundation AI models would be provided as a public service, similar to a health care public option. They would not eliminate opportunities for private foundation models, but they would offer a baseline of price, quality, and ethical development practices that corporate players would have to match or exceed to compete.”

Of course, quantum computing isn’t beset by the same black-box problem that surrounds AI models, yet what it implies for our ability to secure digital data means it could still benefit from state-led development. Specifically: (i) a government-funded technology standard could specify the baseline for the private sector to “match or exceed to compete” so that computers deployed to secure public data maintain a minimum level of security; (ii) private innovation can build on the standard, with the advantage of not having to lay new foundations of its own; and (iii) the data and the schematics pertaining to the standard should be in the public domain, thus restricting private-sector IP to specific innovations.[1]


[1] Contrary to a lamentable public perception, just knowing how a digital technology works doesn’t mean it can be hacked.

Majorana 1, science journalism, and other things

By: VM
28 February 2025 at 06:42

While I have many issues with how the Nobel Prizes are put together as an institution, the scientific achievements they have revealed have been some of the funnest concepts I’ve discovered in science, including the clever ways in which scientists revealed them. If I had to rank them on this metric, first place would be a tie between the chemistry and the physics prizes of 2016. The chemistry prize went to Jean-Pierre Sauvage, Fraser Stoddart, and Ben Feringa “for the design and synthesis of molecular machines”. Likewise, the physics prize was shared between David Thouless, Duncan Haldane, and John Kosterlitz “for theoretical discoveries of topological phase transitions and topological phases of matter”. If you like, you can read my piece about the 2016 chemistry prize here. A short excerpt about the laureates’ work:

… it is fruitless to carry on speculating about what these achievements could be good for. J. Fraser Stoddart, who shared the Nobel Prize last year with Feringa for having assembled curious molecular arrangements like Borromean rings, wrote in an essay in 2005, “It is amazing how something that was difficult to do in the beginning will surely become easy to do in the event of its having been done. The Borromean rings have captured our imagination simply because of their sheer beauty. What will they be good for? Something for sure, and we still have the excitement of finding out what that something might be.” Feringa said in a 2014 interview that he likes to build his “own world of molecules”. In fact, Stoddart, Feringa and Jean-Pierre Sauvage shared the chemistry prize for having developed new techniques to synthesise and assemble organic molecules in their pursuits.

In the annals of the science Nobel Prizes, there are many, many laureates who allowed their curiosity about something rather than its applications to guide their research. In the course of these pursuits, they developed techniques, insights, technologies or something else that benefited their field as a whole but which wasn’t the end goal. Over time the objects of many of these pursuits have also paved the way for some futuristic technology themselves. All of this is a testament to the peculiar roads the guiding light of curiosity opens. Of course, scientists need specific conditions of their work to be met before they can commit themselves to such lines of inquiry. For just two examples, they shouldn’t be under pressure to publish papers and they shouldn’t have to worry about losing their jobs if they don’t file patents. I can also see where the critics of such blue-sky research stand and why: while there are benefits, it’s hard to say ahead of time what they might be and when they might appear.

This said, the work that won the 2016 physics prize is of a similar nature and also particularly relevant in light of a ‘development’ in the realm of quantum computing earlier this month. Two of the three laureates, Thouless and Kosterlitz, did theoretical work in the 1970s in which they found something unusual. To quote from my piece in The Hindu on February 23:

If you cool some water vapour, it will become water and then ice. If you keep lowering the temperature until nearly absolute zero, the system will have minimal thermal energy, allowing quantum states of matter to show. In the 1970s, Michael Kosterlitz and David Thouless found that the surface of superfluid helium sometimes developed microscopic vortices that moved in pairs. When they raised the temperature, the vortices decoupled and moved freely. It was a new kind of … phase transition: the object’s topological attributes changed in response to changes in energy [rather than it turning from liquid to gas].

These findings, and the many that followed, together with physicists’ efforts to describe this new property of matter using mathematics, in harmony with other existing theories of nature, laid the foundation for Microsoft’s February 19 announcement: that it had developed a quantum-computing chip named Majorana 1 with topological qubits inside. (For more on this, please read my February 23 piece.) Microsoft has been trying to build this chip since at least 2000, when a physicist then on the company’s payroll named Alexei Kitaev published a paper exploring its possibility. Building the thing was a tall order, requiring advances in a variety of fields that eventually had to be brought together in just the right way, but Microsoft knew that if it succeeded the payoff would be tremendous.

This said, even if this wasn’t curiosity-driven research on Microsoft’s part, such research has already played a big role in both the company’s and the world’s fortunes. In the world’s fortunes because, as with the work of Stoddart, Feringa, and Sauvage, the team explored, invented and/or refined new methods en route to building Majorana 1, methods which the rest of the world can potentially use to solve other problems. And in the company’s fortunes because while Kitaev’s paper was motivated by the possibility of a device of considerable technological and commercial value, it drew from a large body of knowledge that — at the time it was unearthed and harmonised with the rest of science — wasn’t at all concerned with a quantum-computing chip in its then-distant future. For all the criticism it attracts, blue-sky research leads to some outcomes that no other form of research can. This isn’t an argument in support of it so much as in defence of not sidelining it altogether.

While I have many issues with how the Nobel Prizes are put together as an institution, I’ve covered each edition with not inconsiderable excitement[1]. Given the prize-giving committee’s fondness for work on or with artificial intelligence last year, it’s possible there’s a physics prize in store for work on the foundations of contemporary quantum computers in the not-too-distant future. When it comes to pass, I will be all too happy to fall back on the many pieces I’ve written on this topic over the years, to be able to confidently piece together the achievements in context and, personally, to understand the work beyond my needs as a journalist, as a global citizen. But until that day, I can’t justify the time I do spend reading up on and writing about this and similar topics as a journalist in a non-niche news publication — one publishing reports, analyses, and commentary for a general audience rather than those with specialised interests.

The justification is necessary at all because the time I spend doing something is time spent not doing something else, and the opportunity cost needs to be rational in the eyes of my employers. At the same time, journalism as a “history of now” would fail if it didn’t bring to the people at large the ideas, priorities, and goals at play in the development of curiosity-driven research and, with the benefit of hindsight, its almost inevitable value for commerce and strategy. This post, up to this point, is the preamble I had in mind for my edition of The Hindu’s Notebook column today. Excerpt:

It isn’t until a revolutionary new technology appears that the value of investing in basic research becomes clear. Many scientists are rooting for more of it. India’s National Science Day, today, is itself rooted in celebrating the discovery of the Raman effect by curiosity-driven study. The Indian government also wants such research in this age of quantum computing, renewable energy, and artificial intelligence. But it isn’t until such technology appears that the value of investing in a science journalism of the underlying research — slow-moving, unglamorous, not application-oriented — also becomes clear. It might even be too late by then.

The scientific ideas that most journalists have overlooked are still very important: they’re the pillars on which the technologies reshaping the world stand. So it’s not fair that they’re overlooked when they’re happening and obscured by other concerns by the time they’ve matured. Without public understanding, input, and scrutiny in the developmental phase, the resulting technologies have fewer chances to be democratic, and the absence of the corresponding variety of journalism is partly to blame.

I would have liked to include the preamble with the piece itself but the word limit is an exacting 620. This is also why I left something else unsaid in the piece, something important for me, the author, to have acknowledged. After the penultimate line — “You might think just the fact that journalists are writing about an idea should fetch it from the fringes to the mainstream, but it does not” — I wanted to say there’s a confounding factor: the skills, choices, and circumstances of the journalists themselves. If a journalist isn’t a good writer[2] or doesn’t have the assistance of good editors, what they write about curiosity-driven research, which already runs on weak legs among the people at large, may simply pass through their feeds and newsletters without inviting even a “huh?”. But as I put down the aforementioned line, a more discomfiting thought erupted at the back of my mind.

In 2017, on the Last Word on Nothing blog, science journalist Cassandra Willyard made a passionate case for the science journalism of obscure things to put people at its centre in order to be effective. The argument’s allure was obvious but it has never sat well with me. The narrative power of human emotion, drawn from the highs or lows in the lives of the people working on obscure scientific ideas, lies in being able to render those ideas more relatable. But my view is that there’s a lot out there we may never write about if we always had to write about the highs and lows it produced among its discoverers or beholders, more so when such highs and lows don’t exist at all, as is often the case with a big chunk of curiosity-driven research. Willyard herself had used the then-recent example of the detection of gravitational waves from two neutron stars smashing into each other millions of lightyears away. This is conveniently (but perhaps not by her design) an example of Big Science where many people spent a long time looking for something and finally found it. There’s certainly a lot of drama here.

But the reason I call having to countenance Willyard’s arguments discomfiting is that I understand what she’s getting at and I know I’m rebutting it with only a modicum of logic. It’s a sentimental holdout, even: I don’t want to have to care about the lives of other people when I know I care very well for how we extracted a world’s worth of new information by ‘reading’ gravitational waves emitted by a highly unusual cosmic event. The awe, to me, is right there. Yet I’m also keenly aware how impactful the journalism advocated by Willyard can be, having seen it in ‘action’ in the feature-esque pieces published by science magazines, where the people are front and centre, and in the number of people who read and talk about those pieces.

I hold out because I believe there are, like me, many people out there (I’ve met a few) who can be awed by narratives of neutron-star collisions that dispense with invoking the human condition. I also believe that while a large number of people may read those feature-esque pieces, their value doesn’t necessarily go beyond storytelling, which is of course typically excellent. But I suppose those narratives of purely scientific research devoid of human protagonists (or antagonists) would have to be at least as excellent in order to captivate audiences just as well. If a journalist — together with the context in which they produce their work — isn’t up to the mark yet, they should strive to be. And this striving is essential if “you might think just the fact that journalists are writing about an idea should fetch it from the fringes to the mainstream, but it does not” is to be meaningful.


[1] Not least because each Nobel Prize announcement is accompanied by three press releases: one making the announcement, one explaining the prize-winning work to a non-expert audience, and one explaining it in its full technical context. Journalism with these resources is actually quite enjoyable. This helps, too.

[2] I’m predominantly a textual journalist and default to ‘write’ when writing about journalistic communication. But of course in this sentence I mean journalists who aren’t good writers and/or good video-makers or editors and/or good podcasters, etc.
