Decoherence plays a dominant role throughout physics, from condensed matter to particle physics, to the very conditions needed to create the density fluctuations that determined the form of our Universe. It is also crucial to understanding the role that quantum mechanical coherence plays in biology, and has been suggested as the key to our understanding of the interpretation of quantum mechanics itself. Because it cuts across all fields of physics, and is critical to our understanding of the role that quantum mechanics plays in all of them, both experimentalists and theoreticians have been struggling with decoherence since the inception of quantum theory. From the practical perspective, the former are today trying to suppress it in order to observe quantum phenomena on a macroscopic scale; such attempts are essential, for example, to the construction of large-scale quantum computers. From the theoretical perspective, the latter are still debating with philosophers the role decoherence plays in the foundations of quantum theory. Discussions of decoherence thus bear on a broad community of physicists and philosophers of science. This Theme Issue brings together members of both communities and highlights some of the technical and conceptual issues that modern studies of decoherence have revealed.

We begin with the theoretical perspective. Here, we explore two relatively little-known ideas: (i) the possibility of decoherence without dissipation, either in the context of the comparison between environmental and ‘intrinsic’ decoherence or in the context of the black hole information loss paradox, and (ii) the idea of ‘false decoherence’, i.e. situations in which a system appears mixed on short time scales while actually remaining pure.

Stamp [1], a condensed matter physics expert, analyses the differences between models of environmentally induced decoherence and the more controversial models of ‘intrinsic’ decoherence (sometimes connected with wave function collapse; here, however, it is rather different, being defined as an intrinsic property of Nature). The oscillator bath and the spin bath models are presented as examples of the former, along with a discussion of the open questions and of the relation of these models to experiments. He then suggests a theoretical framework for intrinsic decoherence, exemplifying it with a model in which decoherence is gravitationally induced. The discussion is crucial for understanding the mechanisms behind decoherence, which is a prerequisite for any future tests of the universality of quantum theory.

Unruh [2], a theoretical physicist and a cosmologist, further develops the notion of decoherence without dissipation that appears, for example, in the spin bath models presented by Stamp, and applies it to the notorious black hole information loss paradox. The main lesson of his first short paper is that, the received wisdom notwithstanding, decoherence can occur—in condensed matter physics as well as in gravitational systems such as black holes—without energy loss. Unruh's [3] second short contribution to this Theme Issue concerns a special example of ‘false decoherence’, in which a quantum system only ‘looks’ classical, or mixed, as a result of a measurement process carried out too quickly. This fact has surprising consequences, for example, in temperature measurements on an oscillator strongly coupled to a heat bath.
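The textbook example of decoherence without dissipation is pure dephasing, in which the system–bath coupling commutes with the system Hamiltonian, so coherences decay while the system's energy is untouched. The following minimal sketch (our illustration, not a model from either paper) applies a pure-dephasing channel to a qubit and checks that the populations survive while the off-diagonal terms shrink:

```python
import numpy as np

def dephase(rho, gamma, t):
    """Apply a pure-dephasing channel for time t with rate gamma.

    Only the off-diagonal (coherence) terms of the density matrix decay;
    the diagonal (populations, hence the energy in the qubit basis) is
    left untouched: decoherence without dissipation.
    """
    decay = np.exp(-gamma * t)
    out = rho.astype(complex).copy()
    out[0, 1] *= decay
    out[1, 0] *= decay
    return out

# Start in the superposition |+> = (|0> + |1>)/sqrt(2)
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho0 = np.outer(plus, plus.conj())

rho_t = dephase(rho0, gamma=1.0, t=3.0)
print(np.real(np.diag(rho_t)))   # populations: still [0.5, 0.5]
print(abs(rho_t[0, 1]))          # coherence: 0.5 * exp(-3)
```

The same structure appears in spin bath models, where the bath records which-path information without exchanging energy with the system.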

Turning to the experimental front, we touch upon control issues in quantum optics and on tunnelling phenomena in mesoscopic physics.

Milburn [4], an expert in quantum optics and control, discusses the importance of decoherence for the classical control of quantum systems. In this context, the notorious quantum measurement problem and its putative solution via decoherence assume a practical guise: the classical controllability of a quantum system also requires us to make explicit reference to the open and dissipative nature of control devices. The paper takes its cue from a model, proposed by Diósi, for reconciling classical gravitation with quantum dynamics, and shows that similar models arise when an open quantum system is used as a classical controller for another target quantum system.

Barbara [5], an experimentalist in mesoscopic physics, reviews the current state of tunnelling experiments in large spin systems, and discusses the origin of hysteresis and of decoherence in those systems. He shows why the classical and quantum dynamics of an effective spin-1/2 coincide, and how the Landau–Zener model can be treated classically. This makes it possible to explain the probabilistic origin of hysteresis in these quantum systems, and to give a classical interpretation of different phenomena associated with ensembles of spins, including quantum tunnelling, relaxation and coherence. He then analyses the main decoherence mechanisms of large spin systems and extends the analysis to other types of qubits.
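For readers unfamiliar with the model, the Landau–Zener result can be stated, in one standard convention (our rendering, not taken from the paper), as follows. For a two-level system swept linearly through an avoided crossing,

$$
H(t) = \frac{1}{2}\begin{pmatrix} vt & \Delta \\ \Delta & -vt \end{pmatrix},
\qquad
P_{\mathrm{diabatic}} = \exp\!\left(-\frac{\pi \Delta^{2}}{2\hbar v}\right),
$$

where $\Delta$ is the gap at the crossing and $v$ the sweep rate. A slow sweep (small $v$) or a large gap suppresses the diabatic transition, which is the regime probed in the molecular-magnet experiments reviewed here.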

The application of decoherence theory to other domains such as biology, informatics and cosmology is exemplified with three essays on its role in biological systems, in quantum information theory and in quantum gravity.

The idea that decoherence plays a role in biological systems is explored by the theoretical physicists Tiersch and Briegel [6]. The starting point for this investigation is the possibility that the behaviour of macroscopic objects and organisms may depend in a much more direct way on the explicitly quantum nature of processes at the molecular level. Their example is bird navigation: certain organisms, such as the European robin (*Erithacus rubecula*), demonstrate a remarkable sensitivity to the Earth's magnetic field. The paper analyses the main physical process believed to be responsible for this sensitivity from the perspective of decoherence theory and the dynamics of open quantum systems. The generic models they discuss indicate the extent to which the behaviour of macroscopic—and, in this case, living—objects may be influenced by quantum mechanical processes such as decoherence.
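A toy version of the physics at stake (our illustration, with assumed parameters, not the authors' model) is a pair of electron spins created in a singlet state: unequal local fields, such as different hyperfine environments, drive coherent singlet–triplet oscillations, so any singlet-dependent reaction yield becomes magnetically sensitive, unless decoherence washes the oscillation out first.

```python
import numpy as np

# Two spin-1/2 particles with different local Larmor frequencies w1, w2
# (assumed values for illustration). Starting from the singlet state,
# the singlet probability oscillates as cos^2((w1 - w2) t / 2).
Sz = 0.5 * np.array([[1.0, 0.0], [0.0, -1.0]])
I2 = np.eye(2)

w1, w2 = 1.0, 0.3
H = w1 * np.kron(Sz, I2) + w2 * np.kron(I2, Sz)

# Singlet state (|01> - |10>)/sqrt(2) in the two-spin basis
singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

def singlet_probability(t):
    """Probability of remaining in the singlet after evolving for time t."""
    vals, vecs = np.linalg.eigh(H)
    U = vecs @ np.diag(np.exp(-1j * vals * t)) @ vecs.conj().T
    return abs(np.vdot(singlet, U @ singlet)) ** 2

print(singlet_probability(2.0))  # equals cos^2((w1 - w2) * 2.0 / 2)
```

Adding a dephasing term to this evolution damps the oscillation, which is precisely why the time scale of decoherence relative to the reaction time scale is the central quantity in such analyses.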

Raussendorf [7], an expert in quantum information theory, reviews the exciting domain of quantum error correction, in which decoherence is believed to pose the largest obstacle to the realization of a large-scale, fault-tolerant and computationally superior quantum computer. Presenting the key building blocks of the theory of quantum error correction (namely, the quantum circuit model, the adaptation of the classical theory of error correction to the quantum regime, the idea of fault tolerance and the main result of the threshold theorems), the paper admits that the theory of quantum error correction is far ahead of its experimental verification, but ends optimistically, noting that several error-correcting protocols have to date been realized in NMR systems and ion traps. In this context, it is worth mentioning that most, if not all, results in this field were derived within the restrictive assumption of the oscillator bath models, in which correlations between errors decay exponentially fast; the status of fault tolerance in the alternative spin bath models, which allow highly correlated errors to persist, is still an open question.
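The simplest of these building blocks is the three-qubit bit-flip code, sketched below (a standard textbook construction, not code from the paper): one logical qubit is spread over three physical qubits, and measuring the stabilizers Z₁Z₂ and Z₂Z₃ reveals the location of a single bit-flip error without disturbing the encoded superposition.

```python
import numpy as np

I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def kron3(a, b, c):
    return np.kron(a, np.kron(b, c))

def encode(alpha, beta):
    """Logical state alpha|000> + beta|111> on three physical qubits."""
    psi = np.zeros(8, dtype=complex)
    psi[0b000] = alpha
    psi[0b111] = beta
    return psi

def syndrome(psi):
    """Eigenvalues (+1/-1) of the stabilizers Z1Z2 and Z2Z3."""
    s1 = np.vdot(psi, kron3(Z, Z, I) @ psi).real
    s2 = np.vdot(psi, kron3(I, Z, Z) @ psi).real
    return round(s1), round(s2)

def correct(psi):
    """Look up the error location from the syndrome and flip it back."""
    table = {(1, 1):   kron3(I, I, I),   # no error
             (-1, 1):  kron3(X, I, I),   # bit flip on qubit 1
             (-1, -1): kron3(I, X, I),   # bit flip on qubit 2
             (1, -1):  kron3(I, I, X)}   # bit flip on qubit 3
    return table[syndrome(psi)] @ psi

psi = encode(0.6, 0.8)
corrupted = kron3(I, X, I) @ psi      # bit-flip error on the middle qubit
recovered = correct(corrupted)
print(np.allclose(recovered, psi))    # True: logical state restored
```

Note that the syndrome measurement never learns the amplitudes 0.6 and 0.8; it extracts only which qubit was flipped, which is what allows correction without collapsing the logical state.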

Kiefer [8], a theoretician in quantum cosmology, argues for the role of decoherence in the emergence of the classical world. Assuming the universality of quantum theory, and in particular the universality of the linearity of Schrödinger's equation and the superposition principle, he applies it to the Universe as a whole, as a closed system. In such a setting of quantum cosmology, one writes the wave function of the Universe, and separates relevant from irrelevant degrees of freedom—the scale factor of the Universe and a homogeneous background field, from the small density fluctuations—and shows how, owing to the interaction between the former and the latter, the relevant degrees of freedom decohere into their classical values. Further considerations lead to a kind of ‘consistency proof’ of quantum gravity with the classical picture of the Universe, and in particular with the ubiquitous phenomenon of thermodynamic irreversibility.

Finally, we discuss the conceptual and methodological issues that surround decoherence, focusing on its role in physics and in the philosophy thereof.

That decoherence is instrumental to the foundational resolution of the quantum measurement problem is the starting point for Wallace [9], a philosopher of physics, who clarifies the exact role decoherence may play in attempts to give a dynamical analysis of the measurement process, and in so doing reformulates the measurement problem to better fit contemporary scientific practice. He then analyses the role decoherence plays in each of the possible solutions (and dissolutions) of the problem, either as a constraint on those approaches that aim to modify quantum theory or as a consistency requirement between the underlying quantum dynamics and the apparent classical behaviour. The interpretation that is ‘best served’ by decoherence is, according to Wallace, Everett's many worlds.

Hagar [10], another philosopher of physics, starts with a brief historical account of decoherence and its conceptual roots in the foundations of statistical mechanics, which explains the initial emphasis on the role of dissipation in decoherence models. In trying to pinpoint the philosophical importance of decoherence, Hagar locates it in the ongoing attempts to test the universality of quantum theory, where one is presented with an intriguing double bind: on the one hand, decoherence manifests this universality, as the dynamical process itself is triggered by entanglement; on the other hand, it is the major obstacle to testing this universality and the ubiquity of entanglement on all scales.

## Acknowledgements

As the guest editor of this Theme Issue, I wish to thank Lee Gohlike, the owner of the Outing Lodge (MN), where the 14th Seven-Pines Symposium on Decoherence and Entanglement took place in May 2010. Some of the papers included in this Theme Issue were presented and discussed in that meeting under his generous hospitality. I am also grateful to the referees for their service, to the Editor of *Philosophical Transactions of the Royal Society A* for agreeing to the idea of this Theme Issue, and to Suzanne Abbott (publishing editor) for her support and patience throughout, which helped me in maintaining coherence on such a long time scale.

## Footnotes

One contribution of 11 to a Theme Issue ‘Decoherence’.

- This journal is © 2012 The Royal Society