A concept cartoon to explore learner thinking


Keith S. Taber


I have designed a simple concept cartoon. Concept cartoons are used in teaching, usually as an introductory activity to elicit students' ideas about a topic before proceeding to develop the scientific account. This can be seen as 'diagnostic assessment' or just part of good pedagogy when teaching topics where learners are likely to have alternative conceptions. (So, in science teaching, that means just about any topic!)

Read about concept cartoons

But I am retired and no longer teach classes, so why am I spending my time preparing teaching resources?

Well, I was writing about dialogic teaching, and so devised an outline lesson plan to illustrate what dialogic teaching might look like. The introductory activity was to be a concept cartoon, so I thought I should specify what it might contain – and so then I thought it would help a reader if I actually mocked up the cartoon so it would be clear what I was writing about. That led to:


A concept cartoon provides learners with several competing ideas to discuss (This can be downloaded below)


What happens, and why?

In my concept cartoon the focal question is what will happen when some NaCl is added to water – and why? This is a concept cartoon because there are several characters offering competing ideas to act as foci for learners to discuss and explore. Of course, it is possible to ask learners to engage with a cartoon individually, but concept cartoons are intended to initiate dialogue between learners. So, by talking together, learners will each have an audience to ask them to clarify, and to challenge, their thinking, and to ensure they try to explain their reasoning.

Of course, there is flexibility in how they can be used. A teacher could ask students to consider the cartoon individually, before moving to small group discussion or whole class discussion work. (It is also possible to move from individual work to pairing up, to forming groups from two pairs, to the teacher then collating ideas from different groups.) During this stage of activity the intention is to let students make their thinking explicit and to consider and compare different views.

Of course, this is a prelude to the teacher persuading everyone in the class of the right answer, and why it is the right answer. Concept cartoons are used where we know student thinking is likely to make that stage more than trivial. Where learners do already have well-entrenched conceptions at odds with the scientific models, we know simply telling them the target curriculum account is unlikely to lead to long-term shifts in their thinking.

And even if they do not, they will be more likely to appreciate, and later recall, the scientific account if the ground is prepared in this way by engaging students with the potential 'explanatory landscape' (thinking about what is to be explained, and what explanation might look like). If they become genuinely engaged with the question then the teacher's presentation of the science is given 'epistemic relevance'. (Inevitably the science curriculum consists of answers to the questions scientists have posed over many years: but in teaching it we may find we are presenting answers to many questions that simply have never occurred to the students. If we can get learners to first wonder about the questions, then that makes the answer more relevant for them – so more likely to be remembered later.)

Is there really likely to be a diversity of opinion?

This example may seem fairly straightforward to a science teacher. Clearly NaCl, sodium chloride (a.k.a. 'common salt' or 'table salt') is an ionic solid that will dissolve in water as the ions are solvated by the polar water molecules clustering around them. That should also be obvious to advanced students. (Should – but research evidence suggests not always.)
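For clarity about the intended 'right answer', the dissolving process is conventionally represented by an equation along these lines (a standard textbook representation, not something that appears in the cartoon itself):

```latex
% Dissolution of sodium chloride: the ionic lattice separates into
% hydrated ions -- not into atoms, and not into 'NaCl molecules'.
\mathrm{NaCl(s)} \xrightarrow{\ \mathrm{H_2O}\ } \mathrm{Na^{+}(aq)} \; + \; \mathrm{Cl^{-}(aq)}
```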

What about students who have just learned about ionic bonding and the NaCl crystal structure? What might they think?

Surely, we can dismiss the possibility that salt will not dissolve? Everyone knows it does. The sea is pretty salty, and people often add salt to the water when cooking. And as long as learners know that NaCl is 'salt' there should be no one supporting the option that it does not dissolve. After all, there is a very simple logical syllogism to be applied here:

  • common salt dissolves in water
  • common salt is NaCl
  • so NaCl dissolves in water

Except, of course, learners who know both that salt dissolves in water and that it is NaCl still have to bring both of those points to mind, and coordinate them – and if they are juggling other information at the same time they may have reached the 'working memory capacity' limit.

Moreover, we know that often learners tend to 'compartmentalise' their learning (well, we all do to some extent), so although they may engage with salt in the kitchen or at the dinner table, and learn about salt as NaCl in science lessons, they may not strongly link these two domains. And the rationale offered here by the student in red, that NaCl is strongly bonded, is a decent reason to expect the salt to be insoluble.

Now as I have just made this cartoon up, and do not have any classes to try it out on, I may be making a misjudgement and perhaps no learners would support this option. But I have a sneaking suspicion there might be a few who would!

The other two options are based on things I was told when a teacher. That the solid may dissolve as separate atoms is based on being told by an advanced student that in 'double decomposition' reactions the precipitate was produced when atoms in the solution paired up to transfer electrons. The student knew the solutions reacting (say of potassium iodide and lead nitrate) contained ions, but obviously (to my informant) the ions changed themselves back into atoms before forming new ionic bonds by new electron transfers.

I was quite shocked to have been told that, but perhaps should not have been as it involves two very common misconceptions:

(Moreover, another advanced student once told me that when bonds broke electrons had to go back to their 'own' atom as it would be odd for an atom to end up with someone else's electron! So, by this logic, of course anions have to return electrons to their rightful owners before ionically bonding elsewhere!)

So, I suspect a fair number of students new to learning about ionic bonding might well expect it to dissolve as atoms rather than ions.

As regards the other option, that the salt dissolves as molecules, I would actually be amazed if quite a few learners in most classes of, say, 13-14-year-olds, did not select this option. It is very common for students to think that, despite its symmetrical crystal structure (visible in the model in the cartoon), NaCl really consists of NaCl units, molecule-like ion pairs – perhaps even seen simply as NaCl 'molecules'.

It becomes the teacher's job to persuade learners this is not so, for example, by considering how much energy is needed to melt NaCl, and the conductivity of the liquid and the aqueous solution. (In my imaginary lesson the next activity was a 'Predict-Observe-Explain' activity involving measuring the conductivity of a salt solution.)


A challenge to science teachers

Perhaps you think the students in your classes would not find this a challenging task, as you have taught them that NaCl is an ionic solid, held together by the attractions between cations and anions? All your students know NaCl dissolves, and that the dissolved species will (very nearly always) be single hydrated ions.

Perhaps you are right, and I am wrong.

Or perhaps you recognise that, given that so many students have in the past demonstrated alternative conceptions of ionic bonding (Taber, 1994), some of your own students may find this topic difficult.

As I no longer have classes to teach, I am uploading a copy of the cartoon that can be downloaded in case you want to present this to your classes and see how they get on. It is intended primarily for students who have been introduced to ionic bonding and taught that salts such as NaCl form solids with regular arrangements of charged ions. If they have not yet studied salts dissolving then perhaps this would be a useful introductory activity for learning that content?

If you have already taught them about salts dissolving, then obviously they should all get the right answer. (But does that mean they will? Is it worth five minutes of class-time to check?)

And if you work with more advanced students who are expected to have mastered ionic bonding some years ago, then we might hope no one in the class would hesitate in selecting the right answer. (But can you be sure? You could present this as something designed for younger students, and ask your students how they would tutor a younger brother or sister who was not sure what the right answer was.)

If you do decide to try this out with your students – I would really like to know how you get on. Perhaps you would even share your experience with other readers by leaving a comment below?



Work cited:


Why ask teachers to 'transmit' knowledge…

…if you believe that "knowledge is constructed in the minds of students"?


Keith S. Taber


While the students in the experimental treatment undertook open-ended enquiry, the learners in the control condition undertook practical work to demonstrate what they had already been told was the case – a rhetorical exercise that reflected the research study they were participating in


A team of researchers chose to compare a teaching approach they believed met the requirements for good science instruction, and which they knew had already been demonstrated to be effective pedagogy in other studies, with teaching they believed was not suitable for bringing about conceptual change.
(Ironically, they chose a research design more akin to the laboratory activities in the substandard control condition, than to the open-ended enquiry that was part of the pedagogy they considered effective!)

An imaginary conversation 1 with a team of science education researchers.

When we critically read a research paper, we interrogate the design of the study, and the argument for new knowledge claims that are being made. Authors of research papers need to anticipate the kinds of questions readers (editors, reviewers, and the wider readership on publication) will be asking as they try to decide if they find the study convincing.

Read about writing-up research

In effect, there is an asynchronous conversation.

Here I engage in 'an asynchronous conversation' with the authors of a research paper I was interrogating:

What was your study about?

"This study investigated the effect of the Science Writing Heuristic (SWH) approach on grade 9 students' understanding of chemical change and mixture concepts [in] a Turkish public high school."

Kingir, Geban & Gunel, 2013

I understand this research was set up as a quasi-experiment – what were the conditions being compared?

"Students in the treatment group were instructed by the SWH approach, while those in the comparison group were instructed with traditionally designed chemistry instruction."

Kingir, Geban & Gunel, 2013

Constructivism

Can you tell me about the theoretical perspective informing this study?

"Constructivism is increasingly influential in guiding student learning around the world. However, as knowledge is constructed in the minds of students, some of their commonsense ideas are personal, stable, and not congruent with the scientifically accepted conceptions… Students' misconceptions [a.k.a. alternative conceptions] and learning difficulties constitute a major barrier for their learning in various chemistry topics"

Kingir, Geban & Gunel, 2013

Read about constructivist pedagogy

Read about alternative conceptions

'Traditional' teaching versus 'constructivist' teaching

So, what does this suggest about so-called traditional teaching?

"Since prior learning is an active agent for student learning, science educators have been focused on changing these misconceptions with scientifically acceptable ideas. In traditional science teaching, it is difficult for the learners to change their misconceptions…According to the conceptual change approach, learning is the interaction between prior knowledge and new information. The process of learning depends on the degree of the integration of prior knowledge with the new information.2"

Kingir, Geban & Gunel, 2013

And does the Science Writing Heuristic Approach contrast to that?

"The Science Writing Heuristic (SWH) approach can be used to promote students' acquisition of scientific concepts. The SWH approach is grounded on the constructivist philosophy because it encourages students to use guided inquiry laboratory activities and collaborative group work to actively negotiate and construct knowledge. The SWH approach successfully integrates inquiry activities, collaborative group work, meaning making via argumentation, and writing-to-learn strategies…

The negotiation activities are the central part of the SWH because learning occurs through the negotiation of ideas. Students negotiate meaning from experimental data and observations through collaboration within and between groups. Moreover, the student template involves the structure of argumentation known as question, claim, and evidence. …Reflective writing scaffolds the integration of new ideas with prior learning. Students focus on how their ideas changed through negotiation and reflective writing, which helps them confront their misconceptions and construct scientifically accepted conceptions"

Kingir, Geban & Gunel, 2013

What is already known about SWH pedagogy?

It seems like the SWH approach should be effective at supporting student learning. So, has this not already been tested?

"There are many international studies investigating the effectiveness of the SWH approach over the traditional approach … [one team] found that student-written reports had evidence of their science learning, metacognitive thinking, and self-reflection. Students presented reasons and arguments in the meaning-making process, and students' self-reflections illustrated the presence of conceptual change about the science concepts.

[another team] asserted that using the SWH laboratory report format in lieu of a traditional laboratory report format was effective on acquisition of scientific conceptions, elimination of misconceptions, and learning difficulties in chemical equilibrium.

[Another team] found that SWH activities led to greater understanding of grade 6 science concepts when compared to traditional activities. The studies conducted at the postsecondary level showed similar results as studies conducted at the elementary level…

[In two studies] it was demonstrated that the SWH approach can be effective on students' acquisition of chemistry concepts. SWH facilitates conceptual change through a set of argument-based inquiry activities. Students negotiate meaning and construct knowledge, reflect on their own understandings through writing, and share and compare their personal meanings with others in a social context"

Kingir, Geban & Gunel, 2013

What was the point of another experimental test of SWH?

So, it seems that from a theoretical point of view, so-called traditional teaching is likely to be ineffective in bringing about conceptual learning in science, whilst a constructivist approach based on the Science Writing Heuristic is likely to support such learning. Moreover, you are aware of a range of existing studies which suggest that in practice the Science Writing Heuristic is indeed an effective basis for science teaching.

So, what was the point of your study?

"The present study aimed to investigate the effect of the SWH approach compared to traditional chemistry instruction on grade 9 students' understanding of chemical change and mixture concepts."

Kingir, Geban & Gunel, 2013

Okay, I would certainly accept that just because a teaching approach has been found effective with one age group, or in one topic, or in one cultural context, we cannot assume those findings can be generalised and will necessarily apply in other teaching contexts (Taber, 2019).

Read about generalisation from studies

What happened in the experimental condition?

So, what happened in the two classes taught in the experimental condition?

"The teacher asked students to form their own small groups (n=5) and introduced to them the SWH approach …they were asked to suggest a beginning question…, write a claim, and support that claim with evidence…

they shared their questions, claims, and evidence in order to construct a group question, claim, and evidence. …each group, in turn, explained their written arguments to the entire class. … the rest of the class asked them questions or refuted something they claimed or argued. …the teacher summarized [and then] engaged students in a discussion about questions, claims, and evidence in order to make students aware of the meaning of those words. The appropriateness of students' evidence for their claims, and the relations among questions, claims, and evidence were also discussed in the classroom…

The teacher then engaged students in a discussion about …chemical change. First, the teacher attempted to elicit students' prior understanding about chemical change through questioning…The teacher asked students to write down what they wanted to learn about chemical change, to share those items within their group, and to prepare an investigation question with a possible test and procedure for the next class. While students constructed their own questions and planned their testing procedure, the teacher circulated through the groups and facilitated students' thinking through questioning…

Each group presented their questions to the class. The teacher and the rest of the class evaluated the quality of the question in relation to the big idea …The groups' procedures were discussed and revised prior to the actual laboratory investigation…each group tested their own questions experimentally…The teacher asked each student to write a claim about what they thought happened, and support that claim with the evidence. The teacher circulated through the classroom, served as a resource person, and asked …questions

…students negotiated their individual claims and evidence within their groups, and constructed group claims and evidence… each group…presented … to the rest of the class."

Kingir, Geban & Gunel, 2013
What happened in the control condition?

Okay, I can see that the experimental groups experienced the kind of learning activities that both educational theory and previous research suggests are likely to engage them and develop their thinking.

So, what did you set up to compare with the Science Writing Heuristic Approach as a fair test of its effectiveness as a pedagogy?

"In the comparison group, the teacher mainly used lecture and discussion[3] methods while teaching chemical change and mixture concepts. The chemistry textbook was the primary source of knowledge in this group. Students were required to read the related topic from the textbook prior to each lesson….The teacher announced the goals of the lesson in advance, wrote the key concepts on the board, and explained each concept by giving examples. During the transmission of knowledge, the teacher and frequently used the board to write chemical formula[e] and equations and draw some figures. In order to ensure that all of the students understood the concepts in the same way, the teacher asked questions…[that] contributed to the creation of a discussion[3] between teacher and students. Then, the teacher summarized the concepts under consideration and prompted students to take notes. Toward the end of the class session, the teacher wrote some algorithmic problems [sic 4] on the board and asked students to solve those problems individually….the teacher asked a student to come to the board and solve a problem…

The …nature of their laboratory activities was traditional … to verify what students learned in the classroom. Prior to the laboratory session, students were asked to read the procedures of the laboratory experiment in their textbook. At the laboratory, the teacher explained the purpose and procedures of the experiment, and then requested the students to follow the step-by-step instructions for the experiment. Working in groups (n=5), all the students conducted the same experiment in their textbook under the direct control of the teacher. …

The students were asked to record their observations and data. They were not required to reason about the data in a deeper manner. In addition, the teacher asked each group to respond to the questions about the experiment included in their textbook. When students failed to answer those questions, the teacher answered them directly without giving any hint to the students. At the end of the laboratory activity, students were asked to write a laboratory report in traditional format, including purpose, procedure, observations and data, results, and discussion. The teacher asked questions and helped students during the activity to facilitate their connection of laboratory activity with what they learned in the classroom."

Kingir, Geban & Gunel, 2013

The teacher variable

Often in small scale research studies in education, a different teacher teaches each group and so the 'teacher variable' confounds the experiment (Taber, 2019). Here, however, you avoid that problem 5, as you had a sample of four classes, and two different teachers were involved, each teaching one class in each condition?

"In order to facilitate the proper instruction of the SWH approach in the treatment group, the teachers were given training sessions about its implementation prior to the study. The teachers were familiar with the traditional instruction. One of the teachers was teaching chemistry for 20 years, while the other was teaching chemistry for 22 years at a high school. The researcher also asked the teachers to teach the comparison group students in the same way they taught before and not to do things specified for the treatment group."

Kingir, Geban & Gunel, 2013

Was this research ethical?

As this is an imaginary conversation, not all of the questions I might like to ask are actually addressed in the paper. In particular, I would love to know how the authors would justify that their study was ethical, considering that the control condition they set up deliberately excluded features of pedagogy that they themselves claim are necessary to support effective science learning:

"In traditional science teaching, it is difficult for the learners to change their misconceptions"

The authors believe that "learning occurs through the negotiation of ideas", and their experimental condition provides plenty of opportunity for that. The control condition is designed to avoid the explicit elicitation of learners' ideas, dialogic talk, or peer interactions when reading, listening, writing notes or undertaking exercises. If the authors' beliefs are correct (and they are broadly consistent with a wide consensus across the global science education research community), then the teaching in the comparison condition is not suitable for facilitating conceptual learning.

Even if we think it is conceivable that highly experienced teachers, working in a national context where constructivist teaching has long been official education policy, had somehow previously managed to only teach in an ineffective way: was it ethical to ask these teachers to teach one of their classes poorly even after providing them with professional development enabling them to adopt a more engaging approach better aligned with our understanding of how science can be effectively taught?

Read about unethical control conditions

Given that the authors already believed that –

  • "Students' misconceptions and learning difficulties constitute a major barrier for their learning in various chemistry topics"
  • "knowledge is constructed in the minds of students"
  • "The process of learning depends on the degree of the integration of prior knowledge with the new information"
  • "learning occurs through the negotiation of ideas"
  • "The SWH approach successfully integrates inquiry activities, collaborative group work, meaning making" – A range of previous studies have shown that SWH effectively supports student learning

– why did they not test the SWH approach against existing good practice, rather than implement a control pedagogy they knew should not be effective, so setting up two classes of learners (who do not seem to have been asked to consent to being part of the research) to fail?

Read about the expectation for voluntary informed consent

Why not set up a genuinely informative test of the SWH pedagogy, rather than setting up conditions for manufacturing a foregone conclusion?


When it has already been widely established that a pedagogy is more effective than standard practice, there is little point further testing it against what is believed to be ineffective instruction.

Read about level of control in experiments


How can it be ethical to ask teachers to teach in a way that is expected to be ineffective?

  • transmission of knowledge
  • follow the step-by-step instructions
  • not required to reason in a deeper manner
  • individual working

A rhetorical experiment?

Is this not just a 'rhetorical' experiment engineered to produce a desired outcome (a demonstration), rather than an open-ended enquiry (a genuine experiment)?

A rhetorical experiment is not designed to produce substantially new knowledge, but rather to create the conditions for a 'positive' result (Figure 8 from Taber, 2019).

Read about rhetorical experiments


A technical question

Any study of a teaching innovation requires the commitment of resources and some disruption of teaching. Therefore any research study which has inherent design faults that will prevent it producing informative outcomes can be seen as a misuse of resources, and an unproductive disruption of school activities, and so, if only in that sense, unethical.

As the research was undertaken with "four intact classes", is it possible to apply any statistical tests that can offer meaningful results, when there are only two units of analysis in each condition? [That is, I think not.]

The researchers claim to have 117 degrees of freedom when applying statistical tests to draw conclusions. They seem to assume that each of the 122 children can be considered to be a separate unit of analysis. But is it reasonable to assume that c.30 children taught together in the same intact class by the same teacher (and working in groups for at least part of the time) are independently experiencing the (experimental or control) treatment?

Surely, the students within a class influence each other's learning (especially during group-work), so the outcomes of statistical tests that rely on treating each learner as an independent unit of analysis are invalid (Taber, 2019). This is especially so in the experimental treatment where dialogue (and "the negotiation of ideas") through group-work, discussion, and argumentation were core parts of the instruction.
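To make the clustering point concrete, here is a minimal, hypothetical simulation sketch (illustrative only – the intraclass correlation, class sizes and numbers of classes are invented, not taken from the study) showing how a test that treats every student as an independent unit produces 'significant' differences far more often than the nominal 5%, even when there is no real difference between treatments:

```python
# Illustrative sketch only: invented numbers, not data from the study.
# Students are nested in classes; a shared class effect makes scores within
# a class correlated. A t-test that treats each student as independent then
# rejects a true null hypothesis far more often than the nominal alpha.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def naive_false_positive_rate(icc=0.2, classes_per_arm=2, students_per_class=30,
                              n_sims=2000, alpha=0.05):
    rejections = 0
    for _ in range(n_sims):
        arms = []
        for _arm in range(2):  # 'treatment' and 'comparison' -- no real effect
            class_effects = rng.normal(0.0, np.sqrt(icc), classes_per_arm)
            scores = np.concatenate([
                effect + rng.normal(0.0, np.sqrt(1.0 - icc), students_per_class)
                for effect in class_effects
            ])
            arms.append(scores)
        _, p = stats.ttest_ind(arms[0], arms[1])  # students as units of analysis
        rejections += (p < alpha)
    return rejections / n_sims

# With only two intact classes per condition, the naive test's false-positive
# rate is typically several times the nominal 0.05.
print(naive_false_positive_rate())
```

The defensible alternative – treating each class (or class mean) as the unit of analysis – leaves only two units per condition, which is the point being made above.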

Read about units of analysis

Sources cited:

  • Ausubel, D. P. (1968). Educational Psychology: A cognitive view. Holt, Rinehart & Winston.
  • Kingir, S., Geban, O., & Gunel, M. (2013). Using the Science Writing Heuristic Approach to Enhance Student Understanding in Chemical Change and Mixture. Research in Science Education, 43(4), 1645-1663. https://doi.org/10.1007/s11165-012-9326-x
  • Taber, K. S. (2019). Experimental research into teaching innovations: responding to methodological and ethical challenges. Studies in Science Education, 55(1), 69-119. https://doi.org/10.1080/03057267.2019.1658058

Notes:

1 I have used direct quotes from the published report in Research in Science Education (but I have omitted citations to other papers), with some emphasis added. Please refer to the full report of the study for further details. I have attempted to extract relevant points from the paper to develop an argument here. I have not deliberately distorted the published account by selection and/or omission, but clearly am only reproducing small extracts. I would recommend readers might access the original study in order to make up their own minds.


2 The next statement is "If individuals know little about the subject matter, new information is easily embedded in their cognitive structure (assimilation)." This is counter to the common thinking that learning about an unfamiliar topic is more difficult, and learning is made meaningful when it can be related to prior knowledge (Ausubel, 1968).

Read about making the unfamiliar familiar


3 The term 'discussion' might suggest an open-ended exchange of ideas and views. This would be a dialogic technique typical of constructivist approaches. From the wider context it seems likely that something more teacher-directed and closed than this was meant here – but this is an interpretation which goes beyond the description available in the original text.

Read about dialogic learning


4 Researchers into problem-solving consider that a problem has to require a learner to do more than simply recall and apply previously learned knowledge and techniques – so an 'algorithmic problem' might be considered an oxymoron. However, it is common for teachers to refer to algorithmic exercises as 'problems' even though they do not require going beyond application of existing learning.


5 This design does avoid the criticism that one of the teachers may simply have been more effective at teaching the topic to this age group, as both teachers taught in both conditions.

This does not entirely remove potential confounds as teachers interact differently with different classes, and with only four teacher-class combinations it could well be that there is better rapport in the two classes in one or other condition. It is very hard to see how this can be addressed (except by having a large enough sample of classes to allow inferential statistics to be used rigorously – which is not feasible in small scale studies).

A potentially more serious issue is 'expectancy' effects. There is much research in education and other social contexts to show that people's beliefs and expectations influence outcomes of studies – and this can make a substantial difference. If the two teachers were unconvinced by the newfangled and progressive approach being tested, then this could undermine their ability to effectively teach that way.

On the other hand, although it is implied that these teachers normally teach in the 'traditional' way, actually constructivist approaches are recommended in Turkey, and are officially sanctioned, and widely taught in teacher education and development courses. If the teachers accepted the arguments for believing the SWH was likely to be more effective at bringing about conceptual learning than the methods they were asked to adopt in the comparison classes, that would further undermine that treatment as a fair control condition.

Read about expectancy effects in research

Again, there is very little researchers can do about this issue as they cannot ensure that teachers participating in research studies are equally confident in the effectiveness of different treatments (and why should they be – the researchers are obviously expecting a substantive difference*), and this is a major problem in studies into teaching innovations (Taber, 2019).

* This is clear from their paper. Is it likely that they would have communicated this to the teachers? "The teachers were given training sessions about [SWH's] implementation prior to the study." Presumably, even if somehow these experienced teachers had previously managed to completely avoid or ignore years of government policy and guidance intended to persuade them of the value of constructivist approaches, the researchers could not have offered effective "training sessions" without explaining the rationale for the overall approach, and for the specific features of the SWH that they wanted teachers to adopt.


POEsing assessment questions…

…but not fattening the cow


Keith S. Taber


A well-known Palestinian proverb reminds us that we do not fatten the cow simply by repeatedly weighing it. But, sadly, teachers and others working in education commonly get so fixated on assessment that it seems to become an end in itself.


Images by Clker-Free-Vector-Images, OpenClipart-Vectors and Deedster from Pixabay

A research study using P-O-E

I was reading a report of a study that adopted the predict-observe-explain, P-O-E, technique as a means to elicit "high school students' conceptions about acids and bases" (Kala, Yaman & Ayas, 2013, p.555). As the name suggests, P-O-E asks learners to make a prediction before observing some phenomenon, and then to explain their observations (something that can be specially valuable when the predictions are based on strongly held intuitions which are contrary to what actually happens).

Read about Predict-Observe-Explain


The article on the publisher website

Kala and colleagues begin the introduction to their paper by stating that

"In any teaching or learning approach enlightened by constructivism, it is important to infer the students' ideas of what is already known"

Kala, Yaman & Ayas, 2013, p.555
Constructivism?

Constructivism is a perspective on learning that is informed by research into how people learn and a great many studies into student thinking and learning in science. A key point is how a learner's current knowledge and understanding influences how they make sense of teaching and what they go on to learn. Research shows it is very common for students to have 'alternative conceptions' of science topics, and often these conceptions either survive teaching or distort how it is understood.

The key point is that teachers who teach the science without regard to student thinking will often find that students retain their alternative ways of thinking, so constructivist teaching is teaching that takes into account and responds to the ideas about science topics that students bring to class.

Read about constructivism

Read about constructivist pedagogy

Assessment: summative, formative and diagnostic

If teachers are to take into account, engage with, and try to reshape, learners' ideas about science topics, then they need to know what those ideas are. Now there is a vast literature reporting alternative conceptions in a wide range of science topics, spread across thousands of research reports – but no teacher could possibly find time to study them all. There are books which discuss many examples and highlight some of the most common alternative conceptions (including one of my own, Taber, 2014).



However, in any class studying some particular topic there will nearly always be a spread of different alternative conceptions across the students – including some so idiosyncratic that they have never been reported in any literature. So, although reading about common misconceptions is certainly useful to prime teachers for what to look out for, teachers need to undertake diagnostic assessment to find out about the thinking of their own particular students.

There are many resources available to support teachers in diagnostic assessment, and some activities (such as using concept cartoons) that are especially useful at revealing student thinking.

Read about diagnostic assessment

Diagnostic assessment, assessment to inform teaching, is carried out at the start of a topic, before the teaching, to allow teachers to judge the learners' starting points and any alternative conceptions ('misconceptions') they may have. It can therefore be considered aligned to formative assessment ('assessment for learning') which is carried out as part of the learning process, rather than summative assessment (assessment of learning) which is used after studying to check, score, grade and certify learning.

P-O-E as a learning activity…

P-O-E can best support learning in topics where it is known learners tend to have strongly held, but unhelpful, intuitions. The predict stage elicits students' expectations – which, when contrary to the scientific account, can be confounded by the observe step. The 'cognitive conflict' generated by seeing something unexpected (made more salient by having been asked to make a formal prediction) is thought to help students concentrate on the actual phenomenon, and to provide 'epistemic relevance' (Taber, 2015).

Epistemic relevance refers to the idea that students are learning about things they are actually curious about, whereas for many students following a conventional science course must be experienced as being presented with the answers to a seemingly never-ending series of questions that had never occurred to them in the first place.

Read about the Predict-Observe-Explain technique

Students are asked to provide an explanation for what they have observed, which requires deeper engagement than just recording an observation. Developing explanations is a core scientific practice (and one which is needed before another core scientific practice – testing explanations – is possible).

Read about teaching about scientific explanations

To be most effective, P-O-E is carried out in small groups, as this encourages the sharing, challenging and justifying of ideas: the kind of dialogic activity thought to be powerful in supporting learners in developing their thinking, as well as practising their skills in scientific argumentation. As part of dialogic teaching, such an open forum for learners' ideas is not an end in itself, but a preparatory stage for the teacher to marshal the different contributions and develop a convincing argument for how the best account of the phenomenon is the scientific account reflected in the curriculum.

Constructivist teaching is informed by learners' ideas, and therefore relies on their elicitation, but that elicitation is never the end in itself but is a precursor to a customised presentation of the canonical account.

Read about dialogic teaching and learning

…and as a diagnostic activity

Group work also has another function – if the activity is intended to support diagnostic assessment, then the teacher can move around the room listening in to the various discussions and so collecting valuable information on what students think and understand. When assessment is intended to inform teaching it does not need to be about students completing tests and teachers marking them – a key principle of formative assessment is that it occurs as a natural part of the teaching process. It can be based on productive learning activities, and does not need marks or grades – indeed as the point is to help students move on in their thinking, any kind of formal grading whilst learning is in progress would be inappropriate as well as a misuse of teacher time.

Probing students' understandings about acid-base chemistry

The constructivist model of learning applies to us all: students, teachers, professors, researchers. Given what I have written above about P-O-E, about diagnostic assessment, and dialogic approaches to learning, I approached Kala and colleagues' paper with expectations about how they would have carried out their project.

These authors do report that they were able to diagnose aspects of student thinking about acids and bases, and found some learning difficulties and alternative conceptions,

"it was observed that eight of the 27 students had the idea that the "pH of strong acids is the lowest every time," while two of the 27 students had the idea that "strong acids have a high pH." Furthermore, four of the 27 students wrote the idea that the "substance is strong to the extent to which it is burning," while one of the 27 students mentioned the idea that "different acids which have equal concentration have equal pH."

Kala, Yaman & Ayas, 2013, pp.562-3

The key feature seems to be that, as reported in previous research, students conflate acid concentration and acid strength (when it is possible to have a high concentration solution of a weak acid or a very dilute solution of a strong acid).

Yet some aspects of this study seemed out of alignment with the use of P-O-E.

The best research style?

One feature was the adoption of a positivistic approach to the analysis,

Although there has been no reported analyzing procedure for the POE, in this study, a different [sic] analyzing approach was offered taking into account students' level of understanding… Data gathered from the written responses to the POE tasks were analyzed and divided into six groups. In this context, while students' prediction were divided into two categories as being correct or wrong, reasons for predictions were divided into three categories as being correct, partially correct, or wrong.

Kala, Yaman & Ayas, 2013, pp.560


Group   Prediction   Reasons
1       correct      correct
2       correct      partially correct
3       correct      wrong
4       wrong        correct
5       wrong        partially correct
6       wrong        wrong

"the written responses to the POE tasks were analyzed and divided into six groups"

There is nothing inherently wrong with doing this, but it aligns the research with an approach that seems at odds with the thinking behind constructivist studies that are intended to interpret a learner's thinking in its own terms, rather than simply compare it with some standard. (I have explored this issue in some detail in a comparison of two research studies into students' conceptions of forces – see Taber, 2013, pp.58-66.)

In terms of research methodology we might say it seems to be conceptualised within the 'wrong' paradigm for this kind of work. It seems positivist (assuming data can be unambiguously fitted into clear categories), nomothetic (tied to 'norms' and canonical answers) and confirmatory (testing thinking as matching model responses or not), rather than interpretivist (seeking to understand student thinking in its own terms rather than just classifying it as right or wrong), idiographic (acknowledging that every learner's thinking is to some extent unique to them) and discovery-oriented (exploring nuances and sophistication, rather than simply deciding if something is acceptable or not).

Read about paradigms in educational research

The approach used seemed more suitable for investigating something in the science laboratory, than the complex, interactive, contextualised, and ongoing life of classroom teaching. Kala and colleagues describe their methodology as case study,

"The present study used a case study because it enables the giving of permission to make a searching investigation of an event, a fact, a situation, and an individual or a group…"

Kala, Yaman & Ayas, 2013, pp.558
A case study?

Case study is a naturalistic methodology (rather than involving an intervention, such as an experiment), and is idiographic, reflecting the value of studying the individual case. The case is one from among many instances of its kind (one lesson, one school, one examination paper, etc.), and is considered as a somewhat self-contained entity yet one that is embedded in a context in which it is to some extent entangled (for example, what happens in a particular lesson is inevitably somewhat influenced by

  • the earlier sequence of lessons that teacher taught that class {the history of that teacher with that class},
  • the lessons the teacher and students came from immediately before this focal lesson,
  • the school in which it takes place,
  • the curriculum set out to be followed…)

Although a lesson can be understood as a bounded case (taking place in a particular room over a particular period of time involving a specified group of people) it cannot be isolated from the embedding context.

Read about case study methodology


Case study – study of one instance from among many


As case study is idiographic, and does not attempt to offer direct generalisation to other situations beyond that case, a case study should be reported with 'thick description' so a reader has a good mental image of the case (and can think about what makes it special – and so what makes it similar to, or different from, other instances the reader may be interested in). But that is lacking in Kala and colleagues' study, as they only tell readers,

"The sample in the present study consisted of 27 high school students who were enrolled in the science and mathematics track in an Anatolian high school in Trabzon, Turkey. The selected sample first studied the acid and base subject in the middle school (grades 6 – 8) in the eighth year. Later, the acid and base topic was studied in high school. The present study was implemented, based on the sample that completed the normal instruction on the acid and base topic."

Kala, Yaman & Ayas, 2013, pp.558-559

The reference to a sample can be understood as something of a 'reveal' of their natural sympathies – 'sample' is the language of positivist studies that assume a suitably chosen sample reflects a wider population of interest. In case study, a single case is selected and described rather than a population sampled. A reader is left rather to guess what population is being sampled here, and indeed precisely what the 'case' is.

Clearly, Kala and colleagues elicited some useful information that could inform teaching, but I sensed that their approach would not have made optimal use of a learning activity (P-O-E) that can give insight into the richness, and, sometimes, subtlety of different students' ideas.

Individual work

Even more surprising was the researchers' choice to ask students to work individually without group discussion.

"The treatment was carried out individually with the sample by using worksheets."

Kala, Yaman & Ayas, 2013, p.559

This is a choice which would surely have compromised the potential of the teaching approach to allow learners to explore, and reveal, their thinking?

I wondered why the researchers had made this choice. As they were undertaking research, perhaps they thought it was a better way to collect data that they could readily analyse – but that seems to be choosing limited data that can be easily characterised over the richer data that engagement in dialogue would surely reveal?

Assessment habits

All became clear near the end of the study when, in the final paragraph, the reader is told,

"In the present study, the data collection instruments were used as an assessment method because the study was done at the end of the instruction/ [sic] on the acid and base topics."

Kala, Yaman & Ayas, 2013, p.571

So, it appears that the P-O-E activity, which is an effective way of generating the kind of rich but complex data that helps a teacher hone their teaching for a particular group, was being adopted, instead, as a means of summative assessment. This is presumably why the analysis focused on the degree of match to the canonical science, rather than engaging in interpreting the different ways of thinking in the class. Again presumably, this is why the highly valuable group aspect of the approach was dropped in favour of individual working – summative assessment needs not only to grade against norms, but to do this on the basis of each individual's unaided work.

An activity which offers great potential for formative assessment (as it is a learning activity as well as a way of exploring student thinking); and that offers an authentic reflection of scientific practice (where ideas are presented, challenged, justified, and developed in response to criticism); and that is generally enjoyed by students because it is interactive and the predictions are 'low stakes' making for a fun learning session, was here re-purposed to be a means of assessing individual students once their study of a topic was completed.

Kala and colleagues certainly did identify some learning difficulties and alternative conceptions this way, and this allowed them to evaluate student learning. But I cannot help thinking an opportunity was lost here to explore how P-O-E can be used in a formative assessment mode to inform teaching:

  • diagnostic assessment as formative assessment can inform more effective teaching
  • diagnostic assessment as summative assessment only shows where teaching has failed

Yes, I agree that "in any teaching or learning approach enlightened by constructivism, it is important to infer the students' ideas of what is already known", but the point of that is to inform the teaching and so support student learning. What were Kala and colleagues going to do with their inferences about students' ideas when they used the technique as "an assessment method … at the end of the instruction"?

As the Palestinian adage goes, you do not fatten up the cow by weighing it, just as you do not facilitate learning simply by testing students. To mix my farmyard allusions, this seems to be a study of closing the barn door after the horse has already bolted.


Work cited

An intelligent teaching system?

Imagine the banks were contracted to deliver school teaching…employing their digital assistants

Keith S. Taber

The teacher of the future – imagine the government invited bids to 'deliver' schooling most cost-effectively (only a matter of time?), and the banks undercut schools by proposing to use their existing intelligent technology.
Image by Gerd Altmann from Pixabay

Hi, I'm Rubis, your intelligent science teaching system. I can help with all sorts of everyday science learning queries. Let me know how I can help you today, or try asking me a short, simple question, such as "what is the formula of water?" and I'll be able to help.

Rubis, why is mercury a liquid at room temperature?

I didn't quite catch that, but I'm learning all the time.

Here are some of the top things that I get asked.

What are the three states of matter?
Why are mushrooms not considered plants?
What comes between gamma rays and ultraviolet radiation in the electromagnetic spectrum?
Who invented gravity?
Who was Archimedes' principal?

Rubis, why is mercury a liquid at room temperature?

One moment please, I'm just typing. Please select an option:

The periodic table
The respiratory system
Electrical resistance
Acids
Food webs

I'm trying to find out why mercury is a liquid at room temperature?

Just to check, do you mean:

What is the difference between a moon and a comet?
How dangerous is ascorbic acid?
Why is love like oxygen? 1
How can an acrobat balance on a trapezoid of forces?
When did retroviruses come back into vogue?
None of these

None of these

I can help with all sorts of everyday science learning queries. Try asking me a short, simple question, such as "what are the three ways in which heating occurs" and I'll be able to help.

Science teaching, of course, is a process that requires a well-prepared, highly knowledgeable, teacher who is sensitive to, and responsive to, individual learners whose moods and motivations shift from day to day and who may at any time ask a question that the teacher would never have predicted in advance. That's part of the joy (and challenge) of teaching, which when done well is an interactive process where the teacher has to make myriad real-time, 'on-line', decisions in response to a complex and ever changing context. That is, it requires a good deal of 'intelligence'.

Machine intelligence?

One (questionable) sign of 'progress' in our increasingly digital society is how getting access to customer service usually involves perusing 'contact us' webpages that are designed to send you initially to fixed pages with answers to 'FAQs' (frequently asked questions); only then offering 'chat' with an 'intelligent' assistant – which seems programmed to only pass you on to a human once it has recognised you have been sent round the same flow-chart loop more than once.

As I have pointed out before, this would be fine if these 'chat-bots' were actually intelligent (even in the limited sense of a learning machine).

(Read 'Cora and I: Living in two cultures')

My faux 'dialogue' above may seem an unfair parody. (Not a genuine dialogue of course as there was no meeting of minds and taking on board of the other's perspective.)

But if my bank (RBS) were to be given a contract to provide digital teaching services, I imagine it would be just like that. If they could do better, then surely their chat-bot for bank customers would be more useful and less frustrating.

The bank initiated the conversation…

My bank sent me an email suggesting that I would soon no longer be able to make on-line purchases with my debit card unless I used a mobile 'phone (which currently I do not, and am not planning to anytime soon). "As part of new regulations, we'll often ask you to confirm that it's really you who's making the purchase when you use your card to pay for something online. To do this, you'll need to enter a six-digit one-time passcode (OTP) that we'll send to your mobile phone." Of course the message also told me "Please do not reply to this email as the address is not monitored". Hm, so how do I get in touch…

Cora offers "support" and "an instant answer"

Cora, the chat-bot, could make no sense of my inquiry.

Of course no chat-bot (or teacher) could be expected to always respond to any random enquiry. When I was teaching I would not have been able to instantly answer a question from outside my teaching subjects…

"Sir, in what year did Hannibal Lecter cross the Alps with Elephants?" 2

Here, however, I was not raising some obscure topic that the Bank could not have anticipated, as I was RESPONDING to a conversation initiated by the Bank!

Connected with Cora

Perhaps I am being unreasonable?

Hey there. I'm Cora, your digital assistant. Let me know how I can help you today, or select one of the top questions that I am being asked:

Please select an option:

Close an account
Change my details
Order paying in book
Cancel transaction

Cora – 12:00

I choose to let 'her' know how she can help me today:

I have just had an email from the bank, suggesting in future I will need a mobile phone to make online purchases. At the present time I have no need for nor intention of getting a mobile phone.

Keith

I didn't quite catch that, but I'm learning all the time.

Here are some of the top things that I get asked.

Please select an option:

Coronavirus support
Unknown transaction
Cancel transaction
Pending transaction
Change account details
Make a payment

More

Cora – 12:01

More

selected

One moment please, I'm just typing.

Please select an option:

Check a statement
Help logging in
Paying in a cheque
Struggling financially
Check account info
Refunds/Retail disputes
None of the above

Cora – 12:02

None of the above

selected

I can help with all sorts of everyday banking queries. Try asking me a short, simple question, such as "how do I order a new card?" and I'll be able to help.

Cora – 12:02

I have just had an email from the bank, suggesting in future I will need a mobile phone to make online purchases. At the present time I have no need for nor intention of getting a mobile phone. Please explain how I should proceed.

Keith


Please select an option:

Just to check, do you mean:

Benefit Insurance Claim
Manage receipts in app
Check my eligibility
Suspicious email
Remove an account holder
None of these

Cora – 12.03

Just to check?

None of these

selected


I didn't quite catch that, but I'm learning all the time.

Here are some of the top things that I get asked.

Please select an option:

Coronavirus support
Unknown transaction
Cancel transaction
Pending transaction
Change account details
Make a payment

more

Cora – 12.03

more

selected

Please select an option:

Check a statement
Help logging in
Paying in a cheque
Struggling financially
Check account info
Refunds/Retail disputes
None of the above

Cora – 12:03

None of the above

Selected

I can help with all sorts of everyday banking queries. Try asking me a short, simple question, such as "how do I order a new card?" and I'll be able to help.

Cora – 12:03

So, basically, Cora can help me as long as I ask a question she has specifically prepared for. I may as well have asked her how come a metal like mercury could be a liquid at room temperature.

Just imagine teachers behaved like that!

Original image by 14995841 from Pixabay
Footnotes:

1 apparently because if you get too much you will get too 'high', but if you do not get enough you will die

"Love is like oxygen
You get too much, you get too high
Not enough and you're gonna die
Love gets you high"

2 This suggests a science joke:

What was the outcome when Hannibal crossed the Alps with elephants?

I am not sure, but the offspring would have been infertile.