Our work with Eduardo Zubizarreta Casalengua and Camilo López Carreño on Conventional and Unconventional Photon Statistics made the cover of Laser & Photonics Reviews (our second cover since PRL's macroscopic condensates). The artwork is from Carlos Sánchez Muñoz and represents the basic idea of the paper, which is explained in this blog post, although it is self-explanatory for those familiar with the field, and simple enough that everybody can understand what it shows.
The idea is indeed simple: if you mix a coherent laser beam (the red, solid beam coming from the right) with the output of a quantum source (depicted by Carlos as a glowing slab of something, presumably a quantum well, bathed in the confined blueish radiation of a pillar cavity), you can get, on one arm of a beam-splitter (the cube) that introduces the two fields to each other, a much more quantum signal. That is, the fuzzy little balls—photons—glued in some sort of jelly, which is the coherent, i.e., classical, part of the signal (a fainter version of the other bright laser beams), get stripped of this background and emerge from the beam-splitter as pure, single, naked photons, the "quantum juice" in all its glory: the pure quantum light. On the other arm of the beam-splitter, you get an even soggier version of photons stuck in thick coherent marmalade. The scheme is a sort of quantum cleanser: it removes the classical part, and this is what the sketch aims to show (?!).
There is more to it than the sketch can convey. In particular, and counter-intuitively, the photons bathed in the background radiation happen to be bunched, i.e., cluttered together, as is typical of the output of a quantum source, and it is only after removing the coherent radiation, itself neither bunched nor antibunched, that one obtains antibunched photons, i.e., separated from each other. I say counter-intuitively, and it should indeed seem weird at first, that bunched photons minus a featureless coherent background can leave antibunched photons.
But a second's thought shows that it is nothing more than the well-known fact, equally paradoxical at first but easily understood, that light added to light can produce darkness.
The latter is the famous interference of optics, well known since Young, which ceased to surprise after the 19th century, or high school at the latest. The former is its two-photon counterpart. In both cases, you need of course to choose the phase between the two fields correctly, so that, at the one-photon (wave) level, the antinodes are opposite, and, at the two-photon level, the amplitudes of the two-photon components of each field are also opposite. In fact, just as you can produce destructive and constructive interference in classical outputs in this way, merely by varying the phase you can suppress or enhance the two-photon contribution, resulting in antibunching or superbunching. This is what happens when adding an out-of-phase coherent field to the output of a two-level system (2LS) that is coherently driven (the so-called Rayleigh, or Heitler, regime):
When the amplitude is zero, meaning you are not adding anything, you get the antibunching of the two-level system itself. As you increase the amplitude, you spoil that and, sweeping over, you find two resonances: the constructive two-photon interference (superbunching) and the destructive two-photon interference (antibunching).
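The phase condition above can be sketched to lowest order (in my own notation, not necessarily the paper's). Write the weak source field as $|\psi\rangle \propto |0\rangle + c_1|1\rangle + c_2|2\rangle$ and the admixed coherent field of amplitude $\alpha$ as $|0\rangle + \alpha|1\rangle + (\alpha^2/\sqrt{2})|2\rangle + \cdots$; superposing them amounts, to this order, to a displacement of the source field:

\begin{align}
c_1^{\mathrm{tot}} &= c_1 + \alpha\,,\\
c_2^{\mathrm{tot}} &= c_2 + \sqrt{2}\,\alpha\, c_1 + \frac{\alpha^2}{\sqrt{2}}\,,
\end{align}

with $g^{(2)}(0) \approx 2|c_2^{\mathrm{tot}}|^2 / |c_1^{\mathrm{tot}}|^4$. Cancelling $c_1^{\mathrm{tot}}$ is the one-photon, classical destructive interference (light plus light gives darkness); cancelling $c_2^{\mathrm{tot}}$ is its two-photon counterpart and yields antibunching, while maximizing it yields superbunching. Note that a coherent source, $c_2 = c_1^2/\sqrt{2}$, gives $c_2^{\mathrm{tot}} = (c_1+\alpha)^2/\sqrt{2}$: displacing a coherent state leaves it coherent, so the effect requires a genuinely quantum $c_2$.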
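This behaviour is easy to reproduce numerically. The following is a minimal toy sketch (my own construction, not the paper's computation): a weak source state with a deliberately out-of-phase two-photon component (the parameter mu below is my own; mu = 1 would be a coherent state), displaced by a real coherent amplitude to mimic the laser admixed on the beam splitter, with $g^{(2)}(0)$ evaluated along the sweep.

```python
import numpy as np

N = 12  # Fock-space truncation (amplitudes here stay well below it)

# Annihilation operator in the truncated Fock basis: a|n> = sqrt(n)|n-1>
a = np.diag(np.sqrt(np.arange(1, N)), k=1).astype(complex)

def expm_taylor(M, terms=40):
    """Matrix exponential by Taylor series (adequate for the small norms used here)."""
    out = np.eye(M.shape[0], dtype=complex)
    term = out.copy()
    for k in range(1, terms):
        term = term @ M / k
        out += term
    return out

def displace(alpha):
    """Truncated displacement operator D(alpha) = exp(alpha a^dag - alpha* a)."""
    return expm_taylor(alpha * a.conj().T - np.conj(alpha) * a)

def g2(psi):
    """Zero-delay second-order correlation <n(n-1)> / <n>^2 of a pure state."""
    p = np.abs(psi) ** 2
    p /= p.sum()
    n = np.arange(N)
    return (n * (n - 1) * p).sum() / (n * p).sum() ** 2

# Weak source field to second order: |psi> ~ |0> + eps|1> + mu*eps^2/sqrt(2)|2>.
# mu = -3 makes the bare source bunched, g2(0) ~ |mu|^2 = 9, with a two-photon
# component out of phase with its coherent part.
eps, mu = 0.05, -3.0
src = np.zeros(N, dtype=complex)
src[:3] = 1.0, eps, mu * eps**2 / np.sqrt(2)
src /= np.linalg.norm(src)

# Admixing the laser on a beam splitter acts, in this weak-field limit, as a
# displacement; sweeping its amplitude traces the curve described in the text.
alphas = np.linspace(-0.3, 0.3, 601)
g2s = np.array([g2(displace(al) @ src) for al in alphas])

print(f"bare source g2(0): {g2(src):.1f}")    # bunched
print(f"min over sweep:    {g2s.min():.1e}")  # two-photon destructive: antibunching
print(f"max over sweep:    {g2s.max():.1e}")  # two-photon constructive: superbunching
```

The sweep shows exactly the structure discussed above: the bunched source turns strongly antibunched where the two-photon amplitudes cancel, and superbunched where the one-photon amplitudes cancel instead.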
Why is this simple effect, this basic idea, important? Because it is basically everywhere you care to look in the output of coherently driven systems in their weak-excitation limit, and there is a lot of fundamental physics involved. The two antibunching peaks of the above picture, for instance, behave very differently if you look at other photon numbers, i.e., at the three-photon, four-photon, etc., statistics. This led us to a classification of conventional and unconventional photon statistics, named after the popular terminology of conventional and unconventional photon blockade (you can replace photon by whatever: polariton, plasmon, etc.). When the driving is very small, however, it is always a variation of this idea. And we find in the literature a huge, really considerable number of works, all looking at one aspect or another of such an effect without connecting it to the underlying interference. When we made this connection, we produced a big text (that we called between ourselves "the mother paper") covering sufficiently different platforms to show that this was something universal. We made the silly mistake of submitting it with its table of contents on the first page, which we thought would help to navigate the long text.
This triggered the immediate reaction, in particular from two of our Referees, that the text looked like a review and would be better published as such. That was a disappointment, of course, since although it reproduced known results, and several bits had already been touched upon by various Authors, the bulk of the findings were original. Even the editor invited us to submit it as a Review somewhere else. So we did. And of course, when we submitted it in this form, we got the opposite feedback: much of it appeared to be original research. We have been lucky, though, that those who thought it was a review recognized some novelty that could be published in PRA, and those who thought it was not a review agreed that it brought things together and could be published as a review, provided another text extracted the novel part. So we went for a dual submission.