The Indiscrete Quantum

Did Thomas Kuhn make a continuity error?


Keith S. Taber


Would you like to read a science joke?

At least, I think this was intended as a joke:

"A month after his return from Brussels, Poincaré [Henri Poincaré] presented to the French Academy of Sciences the first version of his own detailed proof of the necessity of discontinuity. The full version followed in January 1912, after which date French publication on the quantum, if initially sparse, is nevertheless continuous."

Kuhn, 1978/1987 (emphasis added)

This is from a book on the history of science, and, in particular, one detailing the slow process by which the notion that energy is quantised became established in physics. Quantised, here, means existing as discrete quanta – coming in distinct lumps so to speak – in the way that coinage does, but, say, tap water does not [seem to].

The author, Thomas Kuhn, is most famous for his theory of how science tends to occur within established traditions, so-called paradigms, interrupted by the occasional 'revolutionary' paradigm-shift. At the level of the individual scientist, a paradigm-shift requires the kind of gestalt-switch involved in switching one's perception of an ambiguous image.


An ambiguous figure (Image by ElisaRiva from Pixabay).

"The figure might be seen as two faces (shown in white against a black background). Or the same image (i.e., the same perceptual data) could be seen – that is interpreted – as some kind of goblet or candlestick holder (in black against a white background).

A person can learn to see either version, but not both at the same time. The brain actively organises perception to make sense of the image, and the viewer can force the 'Gestalt-shift' between the two interpretations…."

Taber, 2023, p.197

That is, there is a clear discontinuity in thinking. Understanding the earth in space as one planet among others orbiting a star requires a drastic reorganisation of thinking from seeing the earth as the centre of a universe which all revolves about it. That switch may be the outcome of a lot of deliberation and reflection – but the two paradigms are, Kuhn claimed, incommensurable. That did not mean it was not possible to compare competing paradigms, but rather that each mind-set reflected a particular perspective, and there is no neutral ground that allows a completely objective comparison.

Scientists may only slowly construct a new way of conceptualising a phenomenon or field (Thagard, 1992), but once they have, they can 'see' matters completely differently. Ironically, perhaps, Kuhn suggests that the more historical research reveals the details of how such shifts come about the more the discontinuity can be seen as the outcome of a continuous development in the scientist's thinking,

"…as almost always happens in historical reconstruction, the new narrative is more nearly continuous than its predecessor."
p.354

Kuhn, 1984/1987

So, the paradigm-shift is a kind of quantum jump in thinking. But this discontinuity is also the outcome of a gradual, continuous process. If that seems a contradiction, consider what happens if someone blows up a balloon – in both senses of 'blow up'! Inflating the balloon is a continuous process where the balloon slowly gets larger – until it 'pops'.


A balloon can be inflated to differing degrees – and can be stable at many degrees of inflation as the elastic skin stretches, creating an increasing tension that balances the effect of the increasing (excess) gas pressure inside.

However, the same process that continuously inflates the balloon through a sequence of myriad possible stable states eventually leads to an internal pressure greater than the skin can tolerate, and there is a sudden catastrophic shift to a new, quite different, state (i.e., a burst balloon).

(Image by RAHIL GUPTA from Pixabay)


Kuhn was a physicist who became involved in teaching about the history of science at Harvard, and transitioned to become a historian of science. He wrote a book about the Copernican revolution (1957) – the shift from seeing the universe as earth-centred to appreciating that the earth was one subsidiary part of a sun-centred system – but his ideas about science in general progressing through such dramatic shifts were widely criticised. 1

One of the criticisms was that his model only reflected a limited number of examples of developments in physics, and could not be generalised as applying to science as a matter of course. Kuhn had not trained as a historian and his book on the Copernican revolution largely derived from his reading of secondary sources he had used in preparing lectures. (Many valuable books are primarily based on synthesising other secondary sources – but original works of academic history are based on detailed engagement with original sources such as the original scientific publications, as well as letters, diary entries, and the like.)

Kuhn's book on the 'quantum discontinuity' was intended to be serious historical scholarship. It is a detailed and evidenced account of how an idea introduced into physics as a kind of mental trick – if not perhaps a desperate stop-gap measure – slowly became accepted as actually reflecting a key aspect of the fundamental nature of the physical world.2

A second quantum revolution

Yet this was not the first quantum revolution in science. It was already widely accepted that matter was quantised – the apparently continuous nature of an iron bar or a drop of water was understood to reflect the emergence at a perceptible scale of properties that were due to the interactions of myriad tiny discrete parts. (We generally say matter is made up of tiny particles, but these are not like tiny ball bearings, but more like fuzzy concentrations of electrical fields – quanticles.)

Something that everyone today learns in school – about matter comprising molecules and ions, which are themselves made up of protons, neutrons and electrons – was once a seemingly bizarre and wild conjecture. Even into the Twentieth Century some scientists saw the atomic hypothesis (sic) as only a useful explanatory device which did not offer a realistic description of how the world was actually structured.

Indeed, even today, it is widely considered that one of the most challenging aspects of introductory chemistry is appreciating how models of the structure and properties of matter at the nano-scale of quanticles – ions, molecules, electrons – are used to explain the very different structures and properties of matter at the familiar everyday scale (Taber, 2013).


Chemistry is a science which explains familiar phenomena through models of submicroscopic quanticles that behave in unfamiliar ways (Figure from Taber, 2013)

But scientists came to accept the 'atomic hypothesis' because it explained many phenomena, and proved successful in making predictions – leading to conjectures that could be successfully tested.

The continuous and the discrete

The distinction between what is continuous and what is discrete is generally marked in English in that mass nouns are applied to entities that are seen as continuous. So, at the perceptible scale, air and water are examples of things that are continuous. In English, then, the words 'air' and 'water' are mass nouns rather than count nouns used for countable entities (e.g., coins, chairs, plates, schools).

We can say:

  • I need some air
  • let's get a little more air
  • would you like some water?
  • I have drunk too much water

but not (usually)

  • I need another air
  • let's get many more airs
  • Would you like any waters?
  • I have drunk too many waters

(Ironically, in the circumstances, we teach children that air is a mixture of substances comprising discrete molecules, and that water is a substance comprised of a great many molecules that are attracted to each other. So, we can refer to another molecule, or many more molecules, of water!)

  • One can do much research ('research' is a mass noun), but one cannot do many research. Although one can undertake many studies ('study' is a count noun), which amounts to much research.
  • A person can be quite sad, or very sad (or not sad at all), but is not said to have fewer sads or more sads (or to have no sads).
  • On the other hand, a person may have many books, but not much book. Although we might say the person had much literature (not many literatures). 'Book' is a count noun, where 'literature' is a mass noun. Publications (articles, books, chapters, conference papers, posters) are discrete – and both the references cited at the end of an academic work, and the individual scholar's record of publication, take the form of lists.

Of course, language is fluid (so to speak! – like water and air), and can change and become stretched or modified. We now see references to people who have done 'many researches', even if this is not (yet at least) standard use. Whilst, in English, data is plural (some data, many data) and the singular is datum (the datum, a datum), the widespread use of data as a mass noun (much data)3 has led to the respected newspaper The Financial Times giving in to the mob and deciding to use 'data' for 'datum' in future.

"…in the last few weeks something has happened that has shaken our very civilisation and made walls come tumbling down. The Financial Times…has announced that according to their style guide, henceforth 'data' will be a singular noun."

Tim Harford presenting the BBC radio programme/podcast 'More or Less' episode 'Nurses' pay, ambulance times and forgotten female economists' (Released On: 15 Feb 2023)

The quantum theory

When scientists and physics teachers refer to the quantum theory they tend to mean not the quantisation of matter, but of energy. The idea that energy might be quantised seemed counter-intuitive – even if matter came in lumps, energy was not 'stuff' and was surely continuous in nature. Energy was seen to be more like sadness than coinage.

Yet there were problems. The 'ultraviolet catastrophe' referred to how theory suggested that a 'black-body' radiator (an ideal radiator) should emit a spectrum of radiation showing ever-increasing energy output in moving to higher frequencies. That did not happen. This was not just because real radiators could only be expected to approximate to an ideal model – the discrepancy was extreme. A real hot body gave an emission spectrum with a maximum peak, and then decreasing power output at increasing frequencies (shorter wavelengths); whereas theory predicted a continuously rising curve. (See the figure below.) Clearly, if hot bodies had been able to radiate with infinite power they would have cooled instantly, and then no one could ever have made a decent cup of tea. (Although that may not have been the most serious concern.)


Classical theory predicted something bizarre: that thermal radiators would emit with infinite power, with the energy output increasing with increasing frequency (i.e., decreasing wavelength – read the graph from right to left) of radiation (as shown by the black curve for a body at 5000K according to classical theory). The spectra of actual radiators (other curves) have maxima, beyond which the output drops at higher frequencies (and so the area under a curve – and the energy radiated – is finite).
(Image source: By Darth Kule – Own work, Public Domain, wikimedia)

The theoretically derived spectrum was found to be quite well-matched to empirical results obtained at low enough frequencies, but once into the ultraviolet region the predictions and experimental results diverged considerably, and in a way that became even more extreme as frequency increased. Thus the term ultraviolet catastrophe.
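To see the mismatch in symbols (this is the standard textbook summary of the physics as it is usually presented today, not a formulation taken from Kuhn's book), the classical Rayleigh–Jeans expression and Planck's formula for the spectral radiance of a black body at temperature T can be set side by side:

$$B_\nu^{\text{classical}}(T) = \frac{2\nu^{2} k_{B} T}{c^{2}} \qquad\qquad B_\nu^{\text{Planck}}(T) = \frac{2h\nu^{3}}{c^{2}}\cdot\frac{1}{e^{h\nu/k_{B}T}-1}$$

The classical expression grows without limit as the frequency ν increases, so the predicted total radiated power (the area under the curve) is infinite; in Planck's formula the exponential term suppresses the high-frequency tail, giving the peaked, finite spectrum that is actually observed. At low frequencies the two expressions coincide – which is why the classical account seemed adequate until measurements pushed towards the ultraviolet.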

For that matter, the simple model of the atom with orbiting electrons did not fit classical theory which predicted that an oscillating electrical charge should emit radiation (and in doing so shift to a lower energy state). These radiating atoms should (on this theory) collapse as electrons spiralled into their nuclei. Moreover, this should also happen on a very short time-scale. Clearly matter did not behave as theory predicted. There was something of a crisis in physics.

Quantum hypotheses

The idea that when matter interacts with radiation there might be restrictions on the magnitude of energy changes involved was first introduced as a kind of mathematical 'fix' to bring the theory into line with the empirical data (plural!). Only slowly did it become accepted that this was not just some mathematical trick, but part of an authentic description of how the universe actually appears to be: when a body absorbs or emits radiation this happens in discrete quanta.

That is, what was invented as a thinking tool came to be seen as having genuine physical significance,

"…the changed meaning of the quantity h𝜈 from

a mental subdivision of the energy continuum to

a physically separable atom of energy."

Kuhn, 1984/1987 {emphasis added}

When your desk lamp emits radiation the number of quanta involved is enormous, so it seems a continuous process – just as the shade appears to be a continuous lump of material rather than a vast conglomeration of molecules. Our experience of the effect of turning on a lamp is like observing a beach from such a great distance that the sand looks like a continuous entity – even though we know that if we went on the beach and looked very closely we would see the sand comprised of myriad tiny grains. (Tiny on our scale – still enormous on the scale of individual atoms.)
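To put a rough number on 'enormous' (a purely illustrative back-of-the-envelope estimate; the 10 watts of visible light and the 550 nm typical wavelength are assumed figures for the sake of the arithmetic, not measurements of any particular lamp): each photon of green light carries an energy of about

$$E = h\nu = \frac{hc}{\lambda} \approx \frac{(6.6\times10^{-34}\ \mathrm{J\,s})\times(3.0\times10^{8}\ \mathrm{m\,s^{-1}})}{550\times10^{-9}\ \mathrm{m}} \approx 3.6\times10^{-19}\ \mathrm{J}$$

so a lamp radiating 10 joules of visible light each second would be emitting of the order of 3 × 10¹⁹ photons per second – far too many, each far too small, for any graininess to show up in everyday experience.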


Seen from a distance, sand on a beach seems a continuous material, but under the right viewing conditions we can see it is particulate – comprised of discrete grains
(Image by Marcel from Pixabay)

But in the photoelectric effect, where the absorption of radiation can change the electrical properties of certain materials, it becomes clear that the radiation is not being absorbed as a continuous flow of energy from the radiation field, but as a series of discrete events. There is a threshold frequency of light below which the effect does not occur and the radiation has no effect on the electrical properties of the material. No matter how much we turn up the intensity of the light, even if we are only just below the threshold frequency, we do not see the photoelectric effect, as it relies on the radiation arriving in energy-packets that are individually large enough to have an effect at the atomic level.
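The relationship usually taught as Einstein's photoelectric equation makes the threshold explicit (the notation here is the standard textbook form, not anything specific to Kuhn's account). A photon of frequency ν delivers an energy hν in a single event, and an electron can only escape the material if this exceeds the work function φ – the minimum energy needed to remove it:

$$h\nu = \phi + E_{k,\max} \qquad\Rightarrow\qquad \nu_{\text{threshold}} = \frac{\phi}{h}$$

Below this threshold frequency no electrons are ejected, however intense the light; above it, the maximum kinetic energy of the ejected electrons increases with frequency, while increasing the intensity only increases how many photons arrive – and so how many electrons are released – not the energy delivered in each individual absorption event.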

Consider two children standing by a garden fence behind which the neighbours are having a wild party. Imagine the brother is just too short to see over the fence, but the sister is slightly taller – above the threshold to see over the fence. The taller child observes the crazy events next door, and the longer she watches, the more she observes. However, the shorter child observes nothing. Even if he stands there for the entire evening, he will not get a glimpse, whereas the taller companion sees something – and so sees more than her brother – even if she gets bored and goes away after just a few minutes.


A visual analogy for the kind of threshold that occurs in the photoelectric effect. There is a minimum frequency of radiation (and so a minimum energy quantum) needed to trigger the photoelectric effect – just as there is a minimum height needed to see over the fence, such that a constantly growing child will fairly rapidly transition from being too short (seeing nothing) to tall enough (seeing all there is to see).

Albert Einstein explained this as the radiation being comprised of particle-like packets of energy – the photons. Again this was, initially, widely seen as a heuristic, a way of moving research forward – and so putting aside, at least for the time being, a conceptual problem. But over time it came to be understood as something fundamental about the nature of radiation. Light, and other electromagnetic radiation, may have wave-like aspects, but a full description has to account for its particle-like nature as well.

A physics pun?

Kuhn's book on 'Black-Body Theory and the Quantum Discontinuity' is centrally about the way this idea of radiated energy being discontinuous gradually moved

  • from being a stop-gap heuristic (i.e., treat energy as quantised for the moment) of the kind scientists often use to allow them to make progress despite a conceptual problem they will need to return to at some point
    • (to put an inconsistency into 'quarantine' to use the metaphor suggested by Imre Lakatos)
  • to become seen as a fundamental truth about the nature of the universe: energy, in interacting with matter, is quantised.

In this sense, energy is more like coins than sadness – more like studies than research.

And, in physics, studies are generally published in the research literature as 'papers' – reports of research, each described in a discrete and self-contained article.4 The wider literature also includes books and book chapters and conference papers – but these are also all discrete entities, even when they collectively reflect an author's slowly shifting perspective on some topic.

As Kuhn was centrally writing about the transition from radiation understood as something continuous to radiation understood as energy quantised in discrete photons – something discontinuous – he was clearly well aware of this distinction. Indeed, the 'Quantum Discontinuity' of his title was itself a kind of pun that could refer either

  • to how the quantisation physics describes reflects a discontinuous process – or
  • to the paradigm shift in understanding among the physics community in accepting quantisation as a physical description.

There was a discontinuity in both developing scientific thinking about the physics, and in the physical nature of the universe itself.

'French publication' was not continuous

"French publication" on the quantum, whether sparse or dense, would have occurred through a sequence of discrete publications (and, so, as distinct events separated in time by intervals). "French publication" on the quantum – however, we might be able to describe these events: infrequent, occasional, regular, sporadic – was not continuous.

Or, perhaps better, French publications on the quantum were not continuous?

Surely, Kuhn would not have slipped up in this regard? More likely he was deliberately adopting a loose use of 'continuous' as a kind of pun in contrast to the discrete quanta the publications discussed.

Taking a historical overview from a distance of some decades (like viewing the sand on the beach from a passing boat) the flow of publications about quanta might blur into seeming a continuous stream – perhaps this can be seen as a kind of inverse effect to how Kuhn suggests detailed historical scholarship could reveal the gradual, incremental progression in thinking that led to a revolutionary conceptual rupture.

In the history of science, as in the investigation of matter and energy, whether something appears as a continuum or not very much depends upon the scale at which we view and the grain size at which we sample. And much the same has been found in science education research exploring learners' developing ideas (Taber, 2008).

Coda: data are like energy

Dr Beth Malory, a lecturer in English Linguistics at UCL, commenting on the change in house style at The Financial Times referred to above, suggested that a corpus analysis of the occurrence of the word 'data' in accessible published texts indicates that "for most people, data is a mass noun" – that is, in general use people are much more likely to write "data is" instead of the formally correct "a datum is"/"data are". She made the following intriguing comparison,

"Something like energy or air would be a good analogy for it [data]."

Dr Beth Malory interviewed by Tim Harford on the BBC radio programme/podcast 'More or Less' episode 'Reoffending rates, Welsh taxes and the menopause' (Released On: 22 Feb 2023)

I doubt she had in mind that energy and air, like sand, and indeed like data themselves, are – if you investigate them closely enough – quantised.


Work cited

Notes

1 For scholars in the humanities, being widely criticised is not to be understood as an entirely negative thing. Primarily, it means people have noticed your work, and they think it is significant enough to be challenged: so many scholars would think that being widely criticised is to be welcomed! For that matter, having other scholars publish work criticising your scholarship can be a justification for seeking to publish a response to their criticisms and, so, an opportunity for developing and further disseminating your own ideas.

This might be seen as the academic equivalent of Oscar Wilde's quip that

the only thing worse than being talked about, was not being talked about!


2 Another example of a 'fudge factor come good' might be the 'cosmological constant' that Albert Einstein introduced into his theories to give the equations a form that fitted the assumed nature of the universe. Einstein later considered this his greatest blunder – but other physicists have found ways to interpret the cosmological constant as relating to important, observable features of the universe.


3 I suspect, often, data is actually being used as a collective noun (akin to the committee, the council, the population {nouns which refer to a (singular) group of people}; the swarm, the herd, the flock, the shoal, etc.), where 'the data' refers to the particular 'data set' ('dataset') being discussed.


4 Contributions to the literature are expected to build upon, and cite, existing work in a field – but each research report should set out a coherent and complete, self-contained argument to support its conclusions.

Read about research writing


Court TV: science in the media

Keith S. Taber


Images from Pixabay

I realised that there was something fascinating about the forensic nature of legal proceedings some years ago (1985) when I saw a television dramatisation of the tribunal into the death of Steve Bantu Biko in police custody in (then still apartheid) South Africa. Although this was a re-enactment, it used actual transcripts to present a reconstruction.

Although like most people I was disgusted with apartheid, I probably would not have known about Steve Biko if it had not been for Peter Gabriel's (1980) anthem ('Biko') protesting his killing – that was the hook that got me to take a look at the film. Although I expected I might find it dry or distressing, it was fascinating. Something that I intended to watch almost out of a sense of liberal duty was totally engrossing.


The cover of the single version of 'Biko' https://petergabriel.com/release/biko/

I have recently spent some time looking at footage of the trial of former police officer Derek Chauvin regarding the death of George Floyd. (A case which of course has parallels with Biko's death.) This came from discovering I had access to a whole TV channel (currently) dedicated to showing the court case. I have largely moved from that (as I had less interest in all the commentary which added little to the court 'action'*) to reviewing some of the daily footage from the courtroom available online.

[Read 'Do nerve signals travel faster than the speed of light?']

(* I was especially unimpressed by the trailers for forthcoming cases in U.S. states with the death penalty, where the show anchor gleefully told viewers we could watch verdicts where we would see the accused as they found out if they were to live or die.)


Scene from inside the courtroom: Derek Chauvin represented by his attorney Eric Nelson

The application of science

In this particular case there is a good deal of physics, chemistry and biology (and indeed their interactions) being presented and argued over. I am not sure I would encourage children to watch (and certainly not to approach as an alternative form of entertainment) such serious proceedings – but any who are watching the expert testimony being presented may appreciate a lot about the nature of science (and in particular how data does not become evidence in isolation). There is a potential counter here to all those TV shows where the whole history of the universe is unproblematically pieced together from some DNA collected by a detective offering a suspect a drink of water. (Okay, I exaggerate, if only a little.)

I initially, accidentally, fell upon coverage of pre-trial arguments about what evidence might be admissible in the forthcoming trial – and started to see how the defence may be offering a story quite inconsistent with the widely accepted narrative (based on the much shared film of the incident). I realised that despite thinking I was the kind of person who tries to always look at different perspectives and seek alternative understandings, and reserve judgement until it is due, I had (without being aware of it) already decided what had happened in this case, and in my own head the presumption of innocence had not really been applied.

This then led to me watching some of the footage of jury selection. This is a process I had been aware of, but had not considered in that much detail – and had certainly not fully appreciated why it might take so much time. After all, if even I, who do not live on the same continent, already had a pretty strong assumption of guilt, then selecting people from the local area – people who had lived through the aftermath (protests and riots) – who could put aside everything they had previously learnt, to focus purely on what was presented at trial, was not going to be easy.

There is a television programme, 'Would I lie to you?', where celebrities are given tall autobiographical tales to tell, some of which are true (though I suspect sometimes embellished in the telling) and where points are awarded to the two sides according to whether a person on one team misleads the other team into incorrectly determining 'truth or lie'. This came to mind in watching jury selection.


BBC Promotional shot for 'Would I lie to you?'

As the potential jurors were interviewed I found myself forming hunches of when the judge might excuse someone (who clearly was not going to be able to be fair to both sides) or when one of the attorneys might ask for a potential juror not to be selected to sit. Of course, there is a very big difference in nature between a popular entertainment show where some people act out the telling of a 'lie' (which is not really a lie, as there is no intention to mislead beyond the point of reveal within the game), and the very serious matter of jury selection, but the process of thinking about 'what will they think about this person's presentation?' in observing these different events seemed very similar.

Clearly, the Chauvin prosecution is a very high profile case, given the viral video of the incident and its importance (especially in its wider context as just one more in a continuing sequence of incidents with similar outcomes) in triggering worldwide condemnation of racism and a resurgence of the Black Lives Matter movement.

However, my interest was piqued less by the highly charged public interest in this particular case (important as it is even in its own terms – a man has died in police 'care'; another may be incarcerated for 20 years or more depending on the verdict) than by the window it offered into a real court. I was fascinated by aspects of the legal process. One feels very familiar with the U.S. court system through fictional works (Ally McBeal, The Good Wife/The Good Fight, etc.) but that is entertainment, and accounts of the legal process are very much condensed.

Interpreting data as evidence for a theory

I have always felt an interest in the law, and I think that is not surprising, given that the law acts as a set of formal guidelines (on process, and what is admissible and so forth), and the legal process is forensic, and evidence based, and adopts arguments to suggest how data can be construed as evidence within particular narratives of events.

This all seems parallel to science, and research more generally.

I have even used the law as an analogy in my teaching, to suggest how theory-directed research (which offers generalisable findings and is suitable for publication) and context-directed research (which may inform a practitioner's day-to-day decisions in the classroom) can be seen to carry different burdens of proof, akin to the difference between criminal and civil courts (Taber, 2013). That is:

  • theory-directed research (claimed to be generalisable and worth reporting in the research literature) – should make its case beyond reasonable doubt
  • context-directed research (such as action research carried out to address a local issue) – should make its case on the balance of probabilities, that is, (local) action should be informed by what the evidence suggests is most likely the case

Moreover, in many cases (and certainly very much in the Chauvin trial) science is heavily involved in making arguments and developing the cases for prosecution and/or defence.

The examination of witnesses in trials has a strong, if warped, parallel with research interviewing. The warping comes in because in research an interviewer should look to be unbiased and should be seeking 'the truth' as their informant understands it. In a trial, however, the lawyers for the two sides are each seeking to build a case, and so to ask questions that seek answers most in keeping with the scenario they are looking to establish as best representing the actual situation around an alleged crime.

So, there is much of interest in terms of how science is applied in expert testimony, but perhaps also some lessons from the advocates in how not to do science by seeking to re-shape all the data to fit one's hypothesis.


Work cited: