Why ask teachers to 'transmit' knowledge…

…if you believe that "knowledge is constructed in the minds of students"?


Keith S. Taber


While the students in the experimental treatment undertook open-ended enquiry, the learners in the control condition undertook practical work to demonstrate what they had already been told was the case – a rhetorical exercise that reflected the research study they were participating in.


A team of researchers chose to compare a teaching approach they believed met the requirements for good science instruction, and which they knew had already been demonstrated to be effective pedagogy in other studies, with teaching they believed was not suitable for bringing about conceptual change.
(Ironically, they chose a research design more akin to the laboratory activities in the substandard control condition than to the open-ended enquiry that was part of the pedagogy they considered effective!)

An imaginary conversation 1 with a team of science education researchers.

When we critically read a research paper, we interrogate the design of the study, and the argument for new knowledge claims that are being made. Authors of research papers need to anticipate the kinds of questions readers (editors, reviewers, and the wider readership on publication) will be asking as they try to decide if they find the study convincing.

Read about writing-up research

In effect, there is an asynchronous conversation.

Here I engage in 'an asynchronous conversation' with the authors of a research paper I was interrogating:

What was your study about?

"This study investigated the effect of the Science Writing Heuristic (SWH) approach on grade 9 students' understanding of chemical change and mixture concepts [in] a Turkish public high school."

Kingir, Geban & Gunel, 2013

I understand this research was set up as a quasi-experiment – what were the conditions being compared?

"Students in the treatment group were instructed by the SWH approach, while those in the comparison group were instructed with traditionally designed chemistry instruction."

Kingir, Geban & Gunel, 2013

Constructivism

Can you tell me about the theoretical perspective informing this study?

"Constructivism is increasingly influential in guiding student learning around the world. However, as knowledge is constructed in the minds of students, some of their commonsense ideas are personal, stable, and not congruent with the scientifically accepted conceptions… Students' misconceptions [a.k.a. alternative conceptions] and learning difficulties constitute a major barrier for their learning in various chemistry topics"

Kingir, Geban & Gunel, 2013

Read about constructivist pedagogy

Read about alternative conceptions

'Traditional' teaching versus 'constructivist' teaching

So, what does this suggest about so-called traditional teaching?

"Since prior learning is an active agent for student learning, science educators have been focused on changing these misconceptions with scientifically acceptable ideas. In traditional science teaching, it is difficult for the learners to change their misconceptions…According to the conceptual change approach, learning is the interaction between prior knowledge and new information. The process of learning depends on the degree of the integration of prior knowledge with the new information.2"

Kingir, Geban & Gunel, 2013

And does the Science Writing Heuristic approach contrast with that?

"The Science Writing Heuristic (SWH) approach can be used to promote students' acquisition of scientific concepts. The SWH approach is grounded on the constructivist philosophy because it encourages students to use guided inquiry laboratory activities and collaborative group work to actively negotiate and construct knowledge. The SWH approach successfully integrates inquiry activities, collaborative group work, meaning making via argumentation, and writing-to-learn strategies…

The negotiation activities are the central part of the SWH because learning occurs through the negotiation of ideas. Students negotiate meaning from experimental data and observations through collaboration within and between groups. Moreover, the student template involves the structure of argumentation known as question, claim, and evidence. …Reflective writing scaffolds the integration of new ideas with prior learning. Students focus on how their ideas changed through negotiation and reflective writing, which helps them confront their misconceptions and construct scientifically accepted conceptions"

Kingir, Geban & Gunel, 2013

What is already known about SWH pedagogy?

It seems like the SWH approach should be effective at supporting student learning. So, has this not already been tested?

"There are many international studies investigating the effectiveness of the SWH approach over the traditional approach … [one team] found that student-written reports had evidence of their science learning, metacognitive thinking, and self-reflection. Students presented reasons and arguments in the meaning-making process, and students' self-reflections illustrated the presence of conceptual change about the science concepts.

[another team] asserted that using the SWH laboratory report format in lieu of a traditional laboratory report format was effective on acquisition of scientific conceptions, elimination of misconceptions, and learning difficulties in chemical equilibrium.

[Another team] found that SWH activities led to greater understanding of grade 6 science concepts when compared to traditional activities. The studies conducted at the postsecondary level showed similar results as studies conducted at the elementary level…

[In two studies] it was demonstrated that the SWH approach can be effective on students' acquisition of chemistry concepts. SWH facilitates conceptual change through a set of argument-based inquiry activities. Students negotiate meaning and construct knowledge, reflect on their own understandings through writing, and share and compare their personal meanings with others in a social context"

Kingir, Geban & Gunel, 2013

What was the point of another experimental test of SWH?

So, it seems that from a theoretical point of view, so-called traditional teaching is likely to be ineffective in bringing about conceptual learning in science, whilst a constructivist approach based on the Science Writing Heuristic is likely to support such learning. Moreover, you are aware of a range of existing studies which suggest that in practice the Science Writing Heuristic is indeed an effective basis for science teaching.

So, what was the point of your study?

"The present study aimed to investigate the effect of the SWH approach compared to traditional chemistry instruction on grade 9 students' understanding of chemical change and mixture concepts."

Kingir, Geban & Gunel, 2013

Okay, I would certainly accept that just because a teaching approach has been found effective with one age group, or in one topic, or in one cultural context, we cannot assume those findings can be generalised and will necessarily apply in other teaching contexts (Taber, 2019).

Read about generalisation from studies

What happened in the experimental condition?

So, what happened in the two classes taught in the experimental condition?

"The teacher asked students to form their own small groups (n=5) and introduced to them the SWH approach …they were asked to suggest a beginning question…, write a claim, and support that claim with evidence…

they shared their questions, claims, and evidence in order to construct a group question, claim, and evidence. …each group, in turn, explained their written arguments to the entire class. … the rest of the class asked them questions or refuted something they claimed or argued. …the teacher summarized [and then] engaged students in a discussion about questions, claims, and evidence in order to make students aware of the meaning of those words. The appropriateness of students' evidence for their claims, and the relations among questions, claims, and evidence were also discussed in the classroom…

The teacher then engaged students in a discussion about …chemical change. First, the teacher attempted to elicit students' prior understanding about chemical change through questioning…The teacher asked students to write down what they wanted to learn about chemical change, to share those items within their group, and to prepare an investigation question with a possible test and procedure for the next class. While students constructed their own questions and planned their testing procedure, the teacher circulated through the groups and facilitated students' thinking through questioning…

Each group presented their questions to the class. The teacher and the rest of the class evaluated the quality of the question in relation to the big idea …The groups' procedures were discussed and revised prior to the actual laboratory investigation…each group tested their own questions experimentally…The teacher asked each student to write a claim about what they thought happened, and support that claim with the evidence. The teacher circulated through the classroom, served as a resource person, and asked …questions

…students negotiated their individual claims and evidence within their groups, and constructed group claims and evidence… each group…presented … to the rest of the class."

Kingir, Geban & Gunel, 2013

What happened in the control condition?

Okay, I can see that the experimental groups experienced the kind of learning activities that both educational theory and previous research suggests are likely to engage them and develop their thinking.

So, what did you set up to compare with the Science Writing Heuristic Approach as a fair test of its effectiveness as a pedagogy?

"In the comparison group, the teacher mainly used lecture and discussion[3] methods while teaching chemical change and mixture concepts. The chemistry textbook was the primary source of knowledge in this group. Students were required to read the related topic from the textbook prior to each lesson….The teacher announced the goals of the lesson in advance, wrote the key concepts on the board, and explained each concept by giving examples. During the transmission of knowledge, the teacher and frequently used the board to write chemical formula[e] and equations and draw some figures. In order to ensure that all of the students understood the concepts in the same way, the teacher asked questions…[that] contributed to the creation of a discussion[3] between teacher and students. Then, the teacher summarized the concepts under consideration and prompted students to take notes. Toward the end of the class session, the teacher wrote some algorithmic problems [sic 4] on the board and asked students to solve those problems individually….the teacher asked a student to come to the board and solve a problem…

The …nature of their laboratory activities was traditional … to verify what students learned in the classroom. Prior to the laboratory session, students were asked to read the procedures of the laboratory experiment in their textbook. At the laboratory, the teacher explained the purpose and procedures of the experiment, and then requested the students to follow the step-by-step instructions for the experiment. Working in groups (n=5), all the students conducted the same experiment in their textbook under the direct control of the teacher. …

The students were asked to record their observations and data. They were not required to reason about the data in a deeper manner. In addition, the teacher asked each group to respond to the questions about the experiment included in their textbook. When students failed to answer those questions, the teacher answered them directly without giving any hint to the students. At the end of the laboratory activity, students were asked to write a laboratory report in traditional format, including purpose, procedure, observations and data, results, and discussion. The teacher asked questions and helped students during the activity to facilitate their connection of laboratory activity with what they learned in the classroom."

Kingir, Geban & Gunel, 2013

The teacher variable

Often in small scale research studies in education, a different teacher teaches each group and so the 'teacher variable' confounds the experiment (Taber, 2019). Here, however, you avoid that problem 5, as you had a sample of four classes, and two different teachers were involved, each teaching one class in each condition?

"In order to facilitate the proper instruction of the SWH approach in the treatment group, the teachers were given training sessions about its implementation prior to the study. The teachers were familiar with the traditional instruction. One of the teachers was teaching chemistry for 20 years, while the other was teaching chemistry for 22 years at a high school. The researcher also asked the teachers to teach the comparison group students in the same way they taught before and not to do things specified for the treatment group."

Kingir, Geban & Gunel, 2013

Was this research ethical?

As this is an imaginary conversation, not all of the questions I might like to ask are actually addressed in the paper. In particular, I would love to know how the authors would justify that their study was ethical, considering that the control condition they set up deliberately excluded features of pedagogy that they themselves claim are necessary to support effective science learning:

"In traditional science teaching, it is difficult for the learners to change their misconceptions"

The authors believe that "learning occurs through the negotiation of ideas", and their experimental condition provides plenty of opportunity for that. The control condition is designed to avoid the explicit elicitation of learners' ideas, dialogic talk, or peer interactions when reading, listening, writing notes or undertaking exercises. If the authors' beliefs are correct (and they are broadly consistent with a wide consensus across the global science education research community), then the teaching in the comparison condition is not suitable for facilitating conceptual learning.

Even if we think it is conceivable that highly experienced teachers, working in a national context where constructivist teaching has long been official education policy, had somehow previously managed to only teach in an ineffective way: was it ethical to ask these teachers to teach one of their classes poorly even after providing them with professional development enabling them to adopt a more engaging approach better aligned with our understanding of how science can be effectively taught?

Read about unethical control conditions

Given that the authors already believed that –

  • "Students' misconceptions and learning difficulties constitute a major barrier for their learning in various chemistry topics"
  • "knowledge is constructed in the minds of students"
  • "The process of learning depends on the degree of the integration of prior knowledge with the new information"
  • "learning occurs through the negotiation of ideas"
  • "The SWH approach successfully integrates inquiry activities, collaborative group work, meaning making"
  • a range of previous studies have shown that SWH effectively supports student learning

– why did they not test the SWH approach against existing good practice, rather than implement a control pedagogy they knew should not be effective, so setting up two classes of learners (who do not seem to have been asked to consent to being part of the research) to fail?

Read about the expectation for voluntary informed consent

Why not set up a genuinely informative test of the SWH pedagogy, rather than setting up conditions for manufacturing a foregone conclusion?


When it has already been widely established that a pedagogy is more effective than standard practice, there is little point further testing it against what is believed to be ineffective instruction.

Read about level of control in experiments


How can it be ethical to ask teachers to teach in a way that is expected to be ineffective?

  • transmission of knowledge
  • follow the step-by-step instructions
  • not required to reason in a deeper manner
  • individual working

A rhetorical experiment?

Is this not just a 'rhetorical' experiment engineered to produce a desired outcome (a demonstration), rather than an open-ended enquiry (a genuine experiment)?

A rhetorical experiment is not designed to produce substantially new knowledge: but rather to create the conditions for a 'positive' result (Figure 8 from Taber, 2019).

Read about rhetorical experiments


A technical question

Any study of a teaching innovation requires the commitment of resources and some disruption of teaching. Therefore any research study which has inherent design faults that will prevent it producing informative outcomes can be seen as a misuse of resources, and an unproductive disruption of school activities, and so, if only in that sense, unethical.

As the research was undertaken with "four intact classes", is it possible to apply any statistical tests that can offer meaningful results when there are only two units of analysis in each condition? [That is, I think not.]

The researchers claim to have 117 degrees of freedom when applying statistical tests to draw conclusions. They seem to assume that each of the 122 children can be considered to be a separate unit of analysis. But is it reasonable to assume that c.30 children taught together in the same intact class by the same teacher (and working in groups for at least part of the time) are independently experiencing the (experimental or control) treatment?

Surely, the students within a class influence each other's learning (especially during group-work), so the outcomes of statistical tests that rely on treating each learner as an independent unit of analysis are invalid (Taber, 2019). This is especially so in the experimental treatment where dialogue (and "the negotiation of ideas") through group-work, discussion, and argumentation were core parts of the instruction.
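The force of this objection can be illustrated with the standard (Kish) design-effect correction for clustered samples. The class sizes below follow from the figures reported in the paper (122 students in four intact classes); the intraclass correlation (ICC) values are purely illustrative assumptions, not estimates from the study:

```python
# Sketch of the clustering problem: if students within a class resemble one
# another (positive intraclass correlation, ICC), treating each of the 122
# students as an independent unit overstates the information in the sample.
# The Kish design effect gives the inflation factor for equal-sized clusters.

def design_effect(cluster_size: float, icc: float) -> float:
    """Kish design effect: deff = 1 + (m - 1) * ICC for clusters of size m."""
    return 1 + (cluster_size - 1) * icc

n_students = 122          # total sample reported in the paper
n_classes = 4             # "four intact classes"
mean_class_size = n_students / n_classes  # 30.5

# Illustrative ICC values only (the study reports no ICC):
for icc in (0.1, 0.2, 0.3):
    deff = design_effect(mean_class_size, icc)
    effective_n = n_students / deff
    print(f"ICC = {icc:.1f}: design effect = {deff:.2f}, "
          f"effective sample size = {effective_n:.0f}, not {n_students}")
```

Even a modest within-class correlation shrinks the effective sample from 122 towards a couple of dozen students, so the roughly 117 degrees of freedom assumed by the researchers substantially overstate the precision of any test that ignores the class structure.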

Read about units of analysis

Sources cited:

  • Ausubel, D. P. (1968). Educational Psychology: A cognitive view. Holt, Rinehart & Winston.
  • Kingir, S., Geban, O., & Gunel, M. (2013). Using the Science Writing Heuristic Approach to Enhance Student Understanding in Chemical Change and Mixture. Research in Science Education, 43(4), 1645-1663. https://doi.org/10.1007/s11165-012-9326-x
  • Taber, K. S. (2019). Experimental research into teaching innovations: responding to methodological and ethical challenges. Studies in Science Education, 55(1), 69-119. https://doi.org/10.1080/03057267.2019.1658058 [Download]

Notes:

1 I have used direct quotes from the published report in Research in Science Education (but I have omitted citations to other papers), with some emphasis added. Please refer to the full report of the study for further details. I have attempted to extract relevant points from the paper to develop an argument here. I have not deliberately distorted the published account by selection and/or omission, but clearly I am only reproducing small extracts. I would recommend that readers access the original study in order to make up their own minds.


2 The next statement is "If individuals know little about the subject matter, new information is easily embedded in their cognitive structure (assimilation)." This is counter to the common thinking that learning about an unfamiliar topic is more difficult, and learning is made meaningful when it can be related to prior knowledge (Ausubel, 1968).

Read about making the unfamiliar familiar


3 The term 'discussion' might suggest an open-ended exchange of ideas and views. This would be a dialogic technique typical of constructivist approaches. From the wider context it seems likely that something more teacher-directed and closed was meant here – but this is an interpretation which goes beyond the description available in the original text.

Read about dialogic learning


4 Researchers into problem-solving consider that a problem has to require a learner to do more than simply recall and apply previously learned knowledge and techniques – so an 'algorithmic problem' might be considered an oxymoron. However, it is common for teachers to refer to algorithmic exercises as 'problems' even though they do not require going beyond application of existing learning.


5 This design does avoid the criticism that one of the teachers may simply have been more effective at teaching the topic to this age group, as both teachers taught in both conditions.

This does not entirely remove potential confounds as teachers interact differently with different classes, and with only four teacher-class combinations it could well be that there is better rapport in the two classes in one or other condition. It is very hard to see how this can be addressed (except by having a large enough sample of classes to allow inferential statistics to be used rigorously – which is not feasible in small scale studies).

A potentially more serious issue is 'expectancy' effects. There is much research in education and other social contexts to show that people's beliefs and expectations influence outcomes of studies – and this can make a substantial difference. If the two teachers were unconvinced by the newfangled and progressive approach being tested, then this could undermine their ability to effectively teach that way.

On the other hand, although it is implied that these teachers normally teach in the 'traditional' way, actually constructivist approaches are recommended in Turkey, and are officially sanctioned, and widely taught in teacher education and development courses. If the teachers accepted the arguments for believing the SWH was likely to be more effective at bringing about conceptual learning than the methods they were asked to adopt in the comparison classes, that would further undermine that treatment as a fair control condition.

Read about expectancy effects in research

Again, there is very little researchers can do about this issue, as they cannot ensure that teachers participating in research studies are equally confident in the effectiveness of different treatments (and why should they be – the researchers are obviously expecting a substantive difference*), and this is a major problem in studies into teaching innovations (Taber, 2019).

* This is clear from their paper. Is it likely that they would have communicated this to the teachers? "The teachers were given training sessions about [SWH's] implementation prior to the study." Presumably, even if somehow these experienced teachers had previously managed to completely avoid or ignore years of government policy and guidance intending to persuade them of the value of constructivist approaches, the researchers could not have offered effective "training sessions" without explaining the rationale for the overall approach, and for the specific features of the SWH that they wanted teachers to adopt.


A case of hybrid research design?

When is "a case study" not a case study? Perhaps when it is (nearly) an experiment?

Keith S. Taber

I read this interesting study exploring learners shifting conceptions of the particulate nature of gases.

Mamombe, C., Mathabathe, K. C., & Gaigher, E. (2020). The influence of an inquiry-based approach on grade four learners' understanding of the particulate nature of matter in the gaseous phase: a case study. EURASIA Journal of Mathematics, Science and Technology Education, 16(1), 1-11. doi:10.29333/ejmste/110391

Key features:

  • Science curriculum context: the particulate nature of matter in the gaseous phase
  • Educational context: Grade 4 students in South Africa
  • Pedagogic context: Teacher-initiated inquiry approach (compared to a 'lecture' condition/treatment)
  • Methodology: "qualitative pre-test/post-test case study design" – or possibly a quasi-experiment?
  • Population/sample: the sample comprised 116 students from four grade four classes, two from each of two schools

This study offers some interesting data, providing evidence of how students represent their conceptions of the particulate nature of gases. What most intrigued me about the study was its research design, which seemed to reflect an unusual hybrid of quite distinct methodologies.

In this post I look at whether the study is indeed a case study as the authors suggest, or perhaps a kind of experiment. I also make some comments about the teaching model of the states of matter presented to the learners, and raise the question of whether the comparison condition (lecturing 8-9 year old children about an abstract scientific model) is appropriate, and indeed ethical.

Learners' conceptions of the particulate nature of matter

This paper is well worth reading for anyone who is not familiar with existing research (such as that cited in the paper) describing how children make sense of the particulate nature of matter, something that many find counter-intuitive. As a taster for this, I reproduce here two figures from the paper (which is published open access under a Creative Commons license* that allows sharing and adaptation of copyright material with due acknowledgement).

Figures © 2020 by the authors of the cited paper *

Conceptions are internal, and only directly available to the epistemic subject, the person holding the conception. (Indeed, some conceptions may be considered implicit, and so not even available to direct introspection.) In research, participants are asked to represent their understandings in the external 'public space' – often in talk, here by drawing (Taber, 2013). The drawings have to be interpreted by the researchers (during data analysis). In this study the researchers also collected data from group work during learning (in the enquiry condition) and by interviewing students.

What kind of research design is this?

Mamombe and colleagues describe their study as "a qualitative pre-test/post-test case study design with qualitative content analysis to provide more insight into learners' ideas of matter in the gaseous phase" (p. 3), yet it has many features of an experimental study.

The study was

"conducted to explore the influence of inquiry-based education in eliciting learners' understanding of the particulate nature of matter in the gaseous phase"

p.1

The experiment compared two pedagogical treatments:

  • "inquiry-based teaching…teacher-guided inquiry method" (p.3) guided by "inquiry-based instruction as conceptualized in the 5Es instructional model" (p.5)
  • "direct instruction…the lecture method" (p.3)

These pedagogic approaches were described:

"In the inquiry lessons learners were given a lot of materials and equipment to work with in various activities to determine answers to the questions about matter in the gaseous phase. The learners in the inquiry lessons made use of their observations and made their own representations of air in different contexts."

"the teacher gave probing questions to learners who worked in groups and constructed different models of their conceptions of matter in the gaseous phase. The learners engaged in discussion and asked the teacher many questions during their group activities. Each group of learners reported their understanding of matter in the gaseous phase to the class"

p.5, p.1

"In the lecture lessons learners did not do any activities. They were taught in a lecturing style and given all the notes and all the necessary drawings.

"In the lecture classes the learners were exposed to lecture method which constituted mainly of the teacher telling the learners all they needed to know about the topic PNM [particulate nature of matter]. …During the lecture classes the learners wrote a lot of notes and copied a lot of drawings. Learners were instructed to paste some of the drawings in their books."

pp.5-6

The authors report that,

"The learners were given clear and neat drawings which represent particles in the gaseous, liquid and solid states…The following drawing was copied by learners from the chalkboard."

p.6
Figure used to teach learners in the 'lecture' condition. Figure © 2020 by the authors of the cited paper *
A teaching model of the states of matter

This figure shows increasing separation between particles moving from solid to liquid to gas. It is not a canonical figure, in that the spacing in a liquid is not substantially greater than in a solid (indeed, in ice floating on water the spacing is greater in the solid), whereas the difference in spacing in the two fluid states is under-represented.

Such figures do not show the very important dynamic aspect: that in a solid particles can usually only oscillate around a fixed position (a very low rate of diffusion notwithstanding), whereas in a liquid particles can move around, though movement is restricted by the close arrangement of (and intermolecular forces between) the particles, and in a gas there is a significant mean free path between collisions where particles move with virtually constant velocity. A static figure like this, then, does not show the critical differences in particle interactions which are core to the basic scientific model.

Perhaps even more significantly, Figure 2 suggests there is the same level of order in the three states, whereas the difference in ordering between a solid and a liquid is much more significant than any change in particle spacing.

In teaching, choices have to be made about how to represent science (through teaching models) to learners who are usually not ready to take on board the full details and complexity of scientific knowledge. Here, Figure 2 represents a teaching model where it has been decided to emphasise one aspect of the scientific model (particle spacing) by distorting the canonical model, and to neglect other key features of the basic scientific account (particle movement and arrangement).

External teachers taught the classes

The teaching was undertaken by two university lecturers

"Two experienced teachers who are university lecturers and well experienced in teacher education taught the two classes during the intervention. Each experienced teacher taught using the lecture method in one school and using the teacher-guided inquiry method in the other school."

p.3

So, in each school there was one class taught by each approach (enquiry/lecture) by a different visiting teacher, and the teachers 'swapped' the teaching approaches between schools (a sensible measure to balance possible differences between the skills/styles of the two teachers).

The research design included a class in each treatment in each of two schools

An experiment; or a case study?

Although the study compared progression in learning across two teaching treatments using an analysis of learner diagrams, the study also included interviews, as well as learners' "notes during class activities" (which one would expect would be fairly uniform within each class in the 'lecture' treatment).

The outcome

The authors do not consider their study to be an experiment, despite setting up two conditions for teaching, and comparing outcomes between the two conditions, and drawing conclusions accordingly:

"The results of the inquiry classes of the current study revealed a considerable improvement in the learners' drawings…The results of the lecture group were however, contrary to those of the inquiry group. Most learners in the lecture group showed continuous model in their post-intervention results just as they did before the intervention…only a slight improvement was observed in the drawings of the lecture group as compared to their pre-intervention results"

pp.8-9

These statements can be read in two ways – either

  • as a description of events (it just happened that, with these particular classes, the researchers found better outcomes in the enquiry condition), or
  • as the basis for a generalised inference.

An experiment would be designed to test a hypothesis (this study does not seem to have an explicit hypothesis, nor explicit research questions). Participants would be assigned randomly to conditions (Taber, 2019) or, at least, classes would be randomly assigned (although then, strictly, each class should be considered a single unit of analysis, offering much less basis for statistical comparisons). No information is given in the paper on how it was decided which classes would be taught by which treatment.
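As the paper does not say how classes were allocated to treatments, the point about randomisation can be illustrated with a minimal sketch. All names and labels below are invented for illustration (they are not taken from the study): within each school, the available conditions are shuffled and assigned to that school's classes, so each treatment appears once in each school (a counterbalanced design of the kind the study used).

```python
import random

def assign_conditions(schools, conditions, seed=None):
    """Randomly assign one class per condition within each school.

    Returns a list of (school, class_label, condition) tuples, so that
    each school hosts every condition once (a counterbalanced design).
    """
    rng = random.Random(seed)
    assignment = []
    for school in schools:
        shuffled = conditions[:]  # copy, so each school is shuffled independently
        rng.shuffle(shuffled)
        for class_label, condition in zip(("class A", "class B"), shuffled):
            assignment.append((school, class_label, condition))
    return assignment

# Hypothetical example: two schools, two teaching treatments
allocation = assign_conditions(["School 1", "School 2"],
                               ["enquiry", "lecture"], seed=42)
```

The point of the random shuffle is that neither the researchers nor the teachers choose which class receives which treatment, reducing the scope for (conscious or unconscious) selection effects.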

Representativeness

A study could be carried out with the participation of a complete population of interest (e.g., all of the science teachers in one secondary school), but more commonly a sample is selected from a population of interest. In a true experiment, the sample has to be selected randomly from the population (Taber, 2019) which is seldom possible in educational studies.

The study investigated a sample of 'grade four learners'

In Mamombe and colleagues' study the sample is described. However, there is no explicit reference to the population from which the sample is drawn. Yet the use of the term 'sample' (rather than just, say, 'participants') implies that they did have a population in mind.

The aim of the study is given as "to explore the influence of inquiry-based education in eliciting learners' understanding of the particulate nature of matter in the gaseous phase" (p.1), which could be considered to imply that the population is 'learners'. The title of the paper could be taken to suggest the population of interest is more specific: "grade four learners". However, the authors make no attempt to argue that their sample is representative of any particular population, and therefore have no basis for statistical generalisation beyond the sample (whether to learners, or to grade four learners, or to grade four learners in RSA, or to grade four learners in farm schools in RSA, or…).

Indeed, only descriptive statistics are presented: there is no attempt to use tests of statistical significance to infer whether the difference in outcomes between conditions found in the sample would probably also have been found in the wider population.

(That is, inferential statistics are commonly used to suggest 'we found a statistically significantly better outcome in one condition in our sample, so in the hypothetical situation that we had been able to include the entire population in our study, we would probably have found better mean outcomes in that same condition'.)
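To make that concrete, here is a minimal sketch of one such inferential test – a simple two-sample permutation test on the difference in group means. The post-test scores below are invented purely for illustration; they are not data from Mamombe and colleagues' study.

```python
import random
from statistics import mean

def permutation_test(group_a, group_b, n_resamples=10_000, seed=1):
    """One-sided two-sample permutation test on the difference in means.

    Estimates how often a difference at least as large as the observed
    one would arise if the treatment labels were assigned purely by chance.
    """
    rng = random.Random(seed)
    observed = mean(group_a) - mean(group_b)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    count = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)  # randomly relabel the pooled scores
        diff = mean(pooled[:n_a]) - mean(pooled[n_a:])
        if diff >= observed:
            count += 1
    return count / n_resamples  # estimated one-sided p-value

# Hypothetical post-test scores (illustrative only, not from the study):
enquiry = [14, 16, 13, 17, 15, 18, 16, 14]
lecture = [12, 13, 11, 14, 12, 15, 13, 12]
p = permutation_test(enquiry, lecture)
```

A small p-value would suggest the observed difference is unlikely to be a chance effect of how learners happened to fall into the two groups – which is exactly the kind of inference descriptive statistics alone cannot support.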

This may be one reason why Mamombe and colleagues do not consider their study to be an experiment. The authors acknowledge limitations in their study (as there always are in any study) including that "the sample was limited to two schools and two science education specialists as instructors; the results should therefore not be generalized" (p.9).

Yet, of course, if the results cannot be generalised beyond these four classes in two schools, this undermines the usefulness of the study (and the grounds for the recommendations the authors make for teaching based on their findings in the specific research contexts).

If considered as an experiment, the study suffers from other inherent limitations (Taber, 2019). There were likely novelty effects, and even though there was no explicit hypothesis, it is clear that the authors expected enquiry to be a productive approach, so expectancy effects may have been operating.

Analytical framework

In an experiment it is important to have an objective means to measure outcomes, and this should be determined before data are collected. (Read about 'Analysis' in research studies.) In this study, methods used in previous published work were adopted, and the authors tell us that "A coding scheme was developed based on the findings of previous research…and used during the coding process in the current research" (p.6).

But they then go on to report,

"Learners' drawings during the pre-test and post-test, their notes during class activities and their responses during interviews were all analysed using the coding scheme developed. This study used a combination of deductive and inductive content analysis where new conceptions were allowed to emerge from the data in addition to the ones previously identified in the literature"

p.6

An emerging analytical frame is perfectly appropriate in 'discovery' research, where a pre-determined conceptualisation of how data are to be understood is not employed. However, in 'confirmatory' research, testing a specific idea, the analysis is operationalised prior to collecting data. The use of qualitative data does not exclude a hypothesis-testing, confirmatory study, as qualitative data can be analysed quantitatively (as is done in this study), but using codes that link back to the hypothesis being tested, rather than emergent codes. (Read about 'Approaches to qualitative data analysis'.)
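The hybrid analysis the authors describe can be sketched as follows: a predefined (deductive) coding scheme is applied to responses, while any labels outside the scheme are allowed to emerge (inductively) rather than being discarded. The code labels here are invented for illustration and are not the categories used in the paper.

```python
# Deductive element: a coding scheme fixed before analysis.
# (Category names are illustrative, not those of Mamombe and colleagues.)
PREDEFINED_CODES = {"particulate", "continuous", "mixed"}

def code_responses(responses, predefined=PREDEFINED_CODES):
    """Tally coded responses; labels outside the predefined scheme
    are recorded as emergent (inductive) codes."""
    counts = {}
    emergent = set()
    for label in responses:
        if label not in predefined:
            emergent.add(label)  # inductive: a new category emerges from the data
        counts[label] = counts.get(label, 0) + 1
    return counts, emergent

# Hypothetical coded drawings from one class:
counts, emergent = code_responses(
    ["particulate", "continuous", "continuous", "granular", "particulate"]
)
```

The design tension the text describes is visible even in this toy version: the tallies support quantitative comparison between groups (confirmatory), while the emergent set changes the analytical frame mid-analysis (discovery).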

Much of Mamombe and colleagues' description of their work aligns with an exploratory, discovery approach to enquiry, yet the gist of the study is to compare student representations, in relation to a model of correct/acceptable or alternative conceptions, to test the relative effectiveness of two pedagogic treatments (i.e., an experiment). That is a 'nomothetic' approach that assumes standard categories of response.

Overall, the authors' account of how they collected and analysed data seems to suggest a hybrid approach, with elements of both a confirmatory approach (suitable for an experiment) and elements of a discovery approach (more suitable for case study). It might seem this is a kind of mixed-methods study with both confirmatory/nomothetic and discovery/idiographic aspects – responding to two different types of research question in the same study.

Yet there do not actually seem (**) to be two complementary strands to the research (one exploring the richness of students' ideas, the other comparing variables – i.e., type of teaching versus degree of learning), but rather an attempt to hybridise distinct approaches based on incongruent fundamental (paradigmatic) assumptions about research. (** Explicit research questions stated in the paper could have clarified this issue for a reader.)

So, do we have a case study?

Mamombe and colleagues may have chosen to frame their study as a kind of case study because of the issues raised above in regard to considering it an experiment. However, it is hard to see how it qualifies as case study (even if the editor and peer reviewers of the EURASIA Journal of Mathematics, Science and Technology Education presumably felt this description was appropriate).

Mamombe and colleagues do use multiple data sources, which is a common feature of case study. However, in other ways the study does not meet the usual criteria for case study. (Read more about 'Case study'.)

For one thing, case study is naturalistic. The method is used to study a complex phenomenon (e.g., a teacher teaching a class) that is embedded in a wider context (e.g., a particular school, timetable, cultural context, etc.) such that it cannot be excised for clinical examination (e.g., moving the lesson to a university campus for easy observation) without changing it. Here, there was an intervention, imposed from the outside, with external agents acting as the class teachers.

Even more fundamentally – what is the 'case'?

A case has to have a recognisable ('natural') boundary, albeit one that has some permeability in relation to its context. A classroom, class, year group, teacher, school, school district, etcetera, can be the subject of a case study. Two different classes in one school, combined with two other classes from another school, does not seem to make a bounded case.

In case study, the case has to be defined (not so in this study); and it should be clear it is a naturally occurring unit (not so here); and the case report should provide 'thick description' (not provided here) of the case in its context. Mamombe and colleagues' study is simply not a case study as usually understood: not a "qualitative pre-test/post-test case study design" or any other kind of case study.

That kind of mislabelling does not in itself invalidate research – but it may indicate some confusion in the basic paradigmatic underpinnings of a study. That seems to be the case [sic] here, as suggested above.

Suitability of the comparison condition: lecturing

A final issue of note about the methodology in this study is the nature of one of the two conditions used as a pedagogic treatment. In a true experiment, this condition (against which the enquiry condition was contrasted) would be referred to as the control condition. In a quasi-experiment (where randomisation of participants to conditions is not carried out) this would usually be referred to as the comparison condition.

At one point Mamombe and colleagues refer to this pedagogic treatment as 'direct instruction' (p.3), although this term has become ambiguous as it has been shown to mean quite different things to different authors. This is also referred to in the paper as the lecture condition.

Is the comparison condition ethical?

Parental consent was given for students contributing data for analysis in the study, but parents would likely trust the professional judgement of the researchers to ensure their children were taught appropriately. Readers are informed that "the learners whose parents had not given consent also participated in all the activities together with the rest of the class" (p.3) so it seems some children in the lecture treatment were subject to the inferior teaching approach despite this lack of consent, as they were studying "a prescribed topic in the syllabus of the learners" (p.3).

I have been very critical of a certain kind of 'rhetorical' research (Taber, 2019) report which

  • begins by extolling the virtues of some kind of active / learner-centred / progressive / constructivist pedagogy; explaining why it would be expected to provide effective teaching; and citing numerous studies that show its proven superiority across diverse teaching contexts;
  • then compares this with passive modes of learning, based on the teacher talking and giving students notes to copy, which is often characterised as 'traditional' but is said to be ineffective in supporting student learning;
  • then describes how the authors set up an experiment to test the (superior) pedagogy in some specific context, using as a comparison condition the very passive learning approach they have already criticised as ineffective in supporting learning.

My argument is that such research is unethical:

  • It is not genuine science as the researchers are not testing a genuine hypothesis, but rather looking to demonstrate something they are already convinced of (which does not mean they could not be wrong, but in research we are trying to develop new knowledge).
  • It is not a proper test of the effectiveness of the progressive pedagogy as it is being compared against a teaching approach the authors have already established is sub-standard.

Most critically, young people are subjected to teaching that the researchers already believe they know will disadvantage them, just for the sake of their 'research', to generate data for reporting in a research journal. Sadly, such rhetorical studies are still often accepted for publication despite their methodological weaknesses and ethical flaws.

I am not suggesting that Mamombe, Mathabathe and Gaigher have carried out such a rhetorical study (i.e., one that poses a pseudo-question where from the outset only one outcome is considered feasible). They do not make strong criticisms of the lecturing approach, and even note that it produces some learning in their study:

"Similar to the inquiry group, the drawings of the learners were also clearer and easier to classify after teaching"

p.9

"although the inquiry method was more effective than the lecture method in eliciting improved particulate conception and reducing continuous conception, there was also improvement in the lecture group"

p.10

I have no experience of the South African education context, so I do not know what is typical pedagogy in primary schools there, nor the range of teaching approaches that grade 4 students there might normally experience (in the absence of external interventions such as reported in this study).

It is for the "two experienced teachers who are university lecturers and well experienced in teacher education" (p.3) to have judged whether a lecture approach based on teacher telling, children making notes and copying drawings, but with no student activities, can be considered an effective way of teaching 8-9 year old children a highly counter-intuitive, abstract, science topic. If they consider this good teaching practice (i.e., if it is the kind of approach they would recommend in their teacher education roles) then it is quite reasonable for them to have employed this comparison condition.

However, if these experienced teachers and teacher educators, and the researchers designing the study, considered that this was poor pedagogy, then there is a real question for them to address as to why they thought it was appropriate to implement it, rather than compare the enquiry condition with an alternative teaching approach that they would have expected to be effective.

Sources cited:

* Material reproduced from Mamombe, Mathabathe & Gaigher, 2020 is © 2020 licensee Modestum Ltd., UK. That article is an open access article distributed under the terms and conditions of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/) [This post, excepting that material, is © 2020, Keith S. Taber.]

An introduction to research in education:

Taber, K. S. (2013). Classroom-based Research and Evidence-based Practice: An introduction (2nd ed.). London: Sage.