Those flipping, confounding variables!

Keith S. Taber

Alternative interpretations and a study on flipped learning


Flipping learning

I was reading about a study of 'flipped learning'. Put very simply, the assumption behind flipped learning is that teaching usually follows a pattern of (a) class time spent with the teacher lecturing, followed by (b) students working through examples largely in their own time. This is a pattern that was (and perhaps still is) often found in universities in subjects that largely teach through lecture courses.

The flipped learning approach switches the use of class time to 'active' learning activities, such as working through exercises, by having students undertake some study before class. That is, students learn about what would have been presented in the lecture by reading texts, watching videos, interacting with on-line learning resources, and so forth, BEFORE coming to class. The logic is that the teacher's input is more useful when students are being challenged to apply the new ideas than as a means of presenting information.

That is clearly a quick gloss, and much more could be said about the rationale, the assumptions behind the approach, and its implementation.

(Read more about flipped learning)

However, in simple terms, the modes of instruction for two stages of the learning process

  • being informed of scientific ideas (through a lecture)
  • applying those ideas (in unsupported private study)

are 'flipped' to

  • being informed of scientific ideas (through accessing learning resources)
  • applying those ideas (in a context where help and feedback are provided)

Testing pedagogy

So much for the intention, but does it work? That is where research comes in. If we want to test a hypothesis, such as 'students will learn more if learning is flipped' (or 'students will enjoy their studies more if learning is flipped', or 'more students will opt to study the subject further if learning is flipped', or whatever) then it would seem an experiment is called for.

In principle, experiments allow us to see if changing some factor (say, the sequence of activities in a course module) will change some variable (say, student scores on a test). The experiment is often the go-to methodology in natural sciences: modify one variable, and measure any change in another hypothesised to be affected by it, whilst keeping everything else that could conceivably have an influence constant. Even in science, however, it is seldom that simple, and experiments can never actually 'prove' our hypothesis is correct (or false).
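To make that logic concrete, here is a minimal sketch (in Python, using entirely simulated scores rather than data from any real study) of the basic comparison an experiment sets up: one factor differs between two groups, and a statistical test asks whether the resulting difference in the outcome measure could plausibly be chance variation alone.

```python
# A minimal sketch of the logic of an educational experiment,
# using simulated (entirely hypothetical) test scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

# Hypothetical end-of-module scores (out of 100) under two conditions
lectured = rng.normal(loc=62, scale=10, size=30)  # traditional teaching
flipped = rng.normal(loc=66, scale=10, size=30)   # flipped teaching

# Independent-samples t-test: how likely is a difference this large
# if, in reality, the teaching approach made no difference?
t_stat, p_value = stats.ttest_ind(flipped, lectured)
print(f"mean difference = {flipped.mean() - lectured.mean():.1f}")
print(f"p = {p_value:.3f}")

# Even a small p-value does not 'prove' the hypothesis: it only makes
# chance variation an implausible explanation, leaving confounding
# variables (see below) very much in play.
```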

(Read more about the scientific method)

In education, running experiments is even more challenging (Taber, 2019). Learners, classes, teachers, courses, schools, and universities are not 'natural kinds'. That is, the kind of comparability you can expect between two copper sulphate crystals of a given mass, or two specimens of copper wire of given dimensions, does not apply: it can matter a lot whether you are testing this student or that student, or whether the class is taught by one teacher or another.

People respond to conditions differently to inanimate objects – if testing the conductivity of a sample of a salt solution of a given concentration, it should not matter whether it is Monday morning or Thursday afternoon, whether it is windy outside, which team lost last night's match, or even whether the researcher is respectful or rude to the sample. Clearly, when testing the motivation or learning of students, such things could influence measurements. Moreover, a sample of gas neither knows nor cares what you are expecting to happen when you compress it, but people can be influenced by the expectations of researchers (the so-called expectancy effect, also known as the Pygmalion effect).

(Read about experimental research into teaching innovations)

Flipping the fundamentals of analytic chemistry

In the study, Ponikwer and Patel flipped part of a module on the fundamentals of analytical chemistry, taught within a BSc honours degree in biomedical science. The module was divided into three parts:

  1. absorbance and emission spectroscopy
  2. chromatography and electrophoresis
  3. mass spectrometry and nuclear magnetic resonance spectroscopy

Students were taught the first topics through the usual lectures; the topics of chromatography and electrophoresis were then taught 'flipped'; and the final topics were again taught through the usual lectures. This pattern was repeated over three successive years.

[Figure 1 in the paper offers a useful graphical representation of the study design. If I had been prepared to pay SpringerNature a fee, I would have been allowed to reproduce it here.*]

The authors of the study considered the innovation a success:

This study suggests that flipped learning can be an effective model for teaching analytical chemistry in single topics and potentially entire modules. This approach provides the means for students to take active responsibility in their learning, which they can do at their own pace, and to conduct problem-solving activities within the classroom environment, which underpins the discipline of analytical chemistry. (Ponikwer & Patel, 2018: p.2268)

Confounding variables

Confounding variables are other factors that might vary between the conditions being compared, and so could themselves have an effect on the outcome.

(Read about confounding variables)

Ponikwer and Patel were aware that one needs to be careful in interpreting the data collected in such a study. For example, it is not especially helpful simply to consider how well students did on the examination questions at the end of term, to see whether they did as well, or better, on the flipped topics than on those taught conventionally. Clearly, students might find some topics, or indeed some questions, more difficult than others regardless of how they studied. Ponikwer and Patel reported that on average students did significantly better on questions from the flipped elements, but they included important caveats:

"This improved performance could be due to the flipped learning approach enhancing student learning, but may also be due to other factors, such as students finding the topic of chromatography more interesting or easier than spectroscopy, or that the format of flipped learning made students feel more positive about the subject area compared with those subject areas that were delivered traditionally." (Ponikwer & Patel,  2018: p.2267)

Whilst acknowledging such alternative explanations for their findings might seem to undermine their results, it is good science to be explicit about such caveats. Looking for (and reporting) alternative explanations is a key part of the scientific attitude.

This good scientific practice is also clear where the authors discuss how attendance patterns varied over the course. The authors report that attendance at the start of the flipped segment was similar to what had come before, but then increased slightly during the flipped learning section of the course. They point out that this shift was "not significant": that is, the statistics suggested it could not be ruled out as a chance effect.
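For readers unfamiliar with how such a judgement is reached, here is a sketch (with purely illustrative attendance counts, not the figures reported in the paper) of one common way of testing whether a shift in attendance is 'significant':

```python
# Sketch of testing an attendance shift for statistical significance,
# using made-up counts (not those from Ponikwer & Patel).
from scipy.stats import chi2_contingency

# rows: lectures before the flipped block / flipped sessions
# columns: attended / absent
table = [[63, 37],
         [70, 30]]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-squared = {chi2:.2f}, p = {p_value:.3f}")

# With counts like these, p comes out well above the conventional 0.05
# threshold, so a chance fluctuation cannot be ruled out and the shift
# would be reported as 'not significant'.
```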

However, Ponikwer and Patel do report a statistically "significant reduction in the attendance at the non-flipped lectures delivered after the flipped sessions" (p.2265) – that is, once students had experienced the flipped learning, on average they tended to attend normal lectures less later in their course. The authors suggest this could be a positive reaction to how they experienced the flipped learning, but again they point out that there were confounding variables, and other interpretations could not be ruled out:

"This change in attendance may be due to increased engagement in the flipped learning module; however, it could also reflect a perception that a more exciting approach of lecturing or content is to be delivered. The enhanced level of engagement may also be because students could feel left behind in the problem-solving workshop sessions. The reduction in attendance after the flipped lecture may be due to students deciding to focus on assessments, feeling that they may have met the threshold attendance requirement" (Ponikwer & Patel,  2018: p.2265).

So, with these students, taking this particular course, in this particular university, having this sequence of topics based on some traditional and some flipped learning, there is some evidence of flipped learning better engaging students and leading to improved learning – but subject to a wide range of caveats which allow various alternative explanations of the findings.

(Read about caveats to research conclusions)

Pointless experiments?

Given the difficulties of interpreting experiments in education, one may wonder if there is any point in running experiments on teaching and learning. On the other hand, for the lecturing staff on the course, it would seem strange to get these results and dismiss them (it has not been proved that flipped learning has positive effects, but the results are at least suggestive, and we can only base our actions on the available evidence).

Moreover, Ponikwer and Patel collected other data, such as students' perceptions of the advantages and challenges of the flipped learning approach – data that can complement their statistical tests, and also inform potential modifications of the implementation of flipped learning for future iterations of the course.

(Read about the use of multiple research techniques in studies)

Is generalisation possible?

What does this tell us about the use of flipped learning elsewhere? Studies taking place in a single unique teaching and learning context do not automatically tell us what would have been the case elsewhere – with different lecturing staff, with a different demographic of students, or when learning about marine ecology or general relativity. Such studies are best seen as context-directed, as being most relevant to where they are carried out.

However, again, even if research cannot be formally generalised, that does not mean it cannot be informative to those working elsewhere, who may apply a form of 'reader generalisation' to decide either:

a) that teaching and learning context seems very similar to ours: it might be worth trying that here;

or

b) that is a very different teaching and learning context to ours: it may not be worth the effort and disruption to try that out here based on the findings in such a different context.

(Read about generalisation)

This requires studies to give details of the teaching and learning context where they were carried out (so-called 'thick description'). Clearly, the more similar a study's context is to one's own teaching context, and the wider the range of teaching and learning contexts in which a particular pedagogy or teaching approach has been shown to have positive outcomes, the more reason there is to feel it is worth trying something out in one's own classroom.

I have argued that:

"What are [common in the educational research literature] are individual small-scale experiments that cannot be considered to offer highly generalisable results. Despite this, where these individual studies are seen as being akin to case studies (and reported in sufficient detail) they can collectively build up a useful account of the range of application of tested innovations. That is, some inherent limitations of small-scale experimental studies can be mitigated across series of studies, but this is most effective when individual studies offer thick description of teaching contexts and when contexts for 'replication' studies are selected to best complement previous studies." (Taber, 2019: 106)

In that regard, studies like that of Ponikwer and Patel can be considered not as 'proof' of the effectiveness of flipped learning, but as part of a cumulative evidence base for the value of trying out the approach in various teaching situations.

Why I have not included the original figure showing the study design

* I had hoped to include in this post a copy of the figure in the paper showing the study design. The paper is not published open access, and so the copyright in the 'design' (that is, the design of the figure **, not of the study!) means that it cannot legally be reproduced without permission. I sought permission to reproduce the figure here through the publisher's (SpringerNature) on-line permissions request system, explaining that this was to be used in an academic scholar's personal blog.

Springer granted permission for reuse, but subject to a fee of £53.83.

As copyright holder/managers they are perfectly entitled to do that. However, I had assumed that they would offer free use for a non-commercial purpose that offers free publicity to their publication. I have other uses for my pension, so I refer readers interested in seeing the figure to the original paper.

** Under the conventions associated with copyright law, the reproduction of short extracts of an academic paper for the purposes of criticism and review is normally considered 'fair use' and exempt from copyright restrictions. However, any figure (or table) is treated as a discrete artistic design and cannot be copied from a work in copyright without permission.

(Read about copyright and scholarly works)

 

Works cited:

Ponikwer, F., & Patel, B. A. (2018). Implementation and evaluation of flipped learning for delivery of analytical chemistry topics. Analytical and Bioanalytical Chemistry, 410, 2263-2269.

Taber, K. S. (2019). Experimental research into teaching innovations: responding to methodological and ethical challenges. Studies in Science Education, 55(1), 69-119.

