Writing for the Journal of Petroleum, Chemical Industry, Chemistry Education, Medicine, Drug Abuse, and Archaeology

Just let me learn a new research field, and fire up the time machine, and I'll see what I can do

Keith S. Taber

An invitation from a petroleum journal where the editorial board are said to like my work, asking me to send them an unpublished medical article – preferably a couple of weeks before they wrote to me.

Dear Michael

Thank you for your kind message from the journal 'Petroleum and Chemical Industry International' (email, 23rd November, 2021).

It is always good to know people are noticing my work, and I was of course pleased to learn you had found my article 'Comment on "Increasing chemistry students' knowledge, confidence, and conceptual understanding of pH using a collaborative computer pH simulation"…'.

Given the title of the journal, I could be forgiven for being somewhat surprised that an article critiquing claims in an educational research study would attract your attention. So, to be told that the editorial board members of Petroleum and Chemical Industry International are "really impressed with [my] articles" is just incredible!

You ask me if I can send you some 'type of medical and clinical article'. I do not really think my work could strictly be described in those terms. Indeed, I initially wondered if my research might even fall outside the scope of Petroleum and Chemical Industry International: yet I see the journal has published some quite diverse material, including the wonderfully titled 'An attempt to Characterize Street Pharmaceutical Teachers Abusing Drugs and Aspect of Allergy Among Adult Men Attending Long Distance Institutions in Pune, India'.1 Moreover, I see an editorial for the journal published a few months ago focused on the conjecture that around the year 1100 CE the Yoruba of west Africa may have used glass beads as a form of currency.2

Is it fair then to assume that the journal has a fairly flexible approach to defining its scope, and that a submission that was outside of the 'medical and clinical' categories might still be considered for publication?

If that is so, I wonder what is currently a typical timescale for publication, should a submission be deemed suitable. Would a submission by your suggested deadline of 8th November, for example, have a reasonable chance of being published by, say, mid October?

Yours…

The journal homepage of Petroleum and Chemical Industry International offers a helpful tutorial for any potential contributors explaining what petroleum is and what it is used for
Notes:

1

A research article in Petroleum and Chemical Industry International

2

An editorial in Petroleum and Chemical Industry International

Not a leading international journal…

…of chemistry education…or even a journal of chemistry education

Keith S. Taber

One of these images shows a leading international research journal of chemistry education with academic quality standards and high production values. And the other…is not (any of these things).

I had received one of those unsolicited invitations to publish in the journal: "Write for Us". An editorial assistant wrote to tell me that

"I would appreciate receiving your submission on or before 10th November 2021"

email 'Write for Us – Journal of Chemistry: Education, Research and Practice' received on 22nd October

Publish in haste – retract at leisure?

Such requests to submit something, and quickly, but which are not associated with any special or themed issue, tempt me to write back and ask "why [would you so appreciate receiving my submission on or before 10th November 2021]?" Anyone who is a serious scholar or researcher will know both that producing an academic study takes a good deal of time and that decent journals have a rolling programme of submissions, peer review, and publication. So, it should not make any difference to the outcomes of a submission, or the approximate time from submission to publication, if one submitted on 10th November, or the 11th, or any other date when one had a manuscript ready. 1

So, these deadlines are really about marketing. Sometimes these new journals, which are struggling to establish themselves (it is very easy for a publisher to start a journal these days, but very difficult to attract quality work – or well-qualified reviewers – given the extensive number of existing outlets), will offer reduced, or even waived, publication fees for submissions received by a certain date. This attracts work, helping them build up a body of published studies which can convince other authors they have a viable and sustainable journal.

Here, however, if there was any particular motivation for me to respond by the implied deadline of 10th November, this was not shared.

Another journal of chemistry education research and practice?

I recall contacting the so-called Journal of Chemistry: Education Research and Practice before it even started publishing, when I was editor of a well-established and well-regarded journal with a very similar name: 'Chemistry Education Research and Practice' (CERP, published by the Royal Society of Chemistry).

A genuinely leading international journal – and a journal pretending to be one

I suggested that the proposed name risked the two journals being confused. I discussed this in an editorial:

"In October a colleague and former Board member of this journal was invited by the founding editor of the Journal of Chemistry: Education Research and Practice to join that new journal's editorial board. The journal name seemed very close to Chemistry Education Research and Practice, and I wrote to suggest they should avoid confusion by changing the name before they actually started publishing.

The editor replied to acknowledge that "we can understand your doubts" – and asked me to let them know if I wanted to be on the Board.

I wrote back to suggest again that they should modify the name to "allow the academic community to see your new journal as a genuine attempt to add to the range of scholarly publications in the field, rather than simply employing a cheap trick to mislead authors".

Taber, 2018, p.11

In view of the lack of concern about the similarity of name at the soon to be launched journal, I now suspect this similarity was likely deliberate – to conflate a top journal that did not charge publication fees with an unproven outlet that asks for a hefty fee.

A false claim (i.e., lie)

In any case, the journal website made it clear the journal was not actually specifically about chemistry education research and practice but was a general chemistry journal. The journal describes itself as:

"Journal of Chemistry: Education Research and Practice is a leading International Journal for the publication of high quality articles…It welcomes publication of scientific research papers in the fields of Theoretical and Physical Chemistry, Analytical and Inorganic Chemistry, Organic and Biological Chemistry, Applied and Materials Chemistry, Spectroscopy, Chemical physics, Biological, Medicinal, Environmental chemistry, Biochemistry, Petroleum and Petrochemicals, Materials science, Nuclear chemistry, Polymer chemistry, Pharmacognosy & Phytochemistry, Stereochemistry and Clinical chemistry"

Website of OPAST Group LLC, publisher of the dodgy journal

It is certainly not a 'leading International Journal' even if it genuinely aspires to be one. So, that is simply a false claim. Perhaps a reader might wonder if this is just my opinion – but the journal was making such a claim before it had begun publishing when there could be absolutely no basis for the lie.

"I wrote back pointing out that the statement on their website that the 'Journal of Chemistry: Education Research and Practice is a leading International Journal for the publication of high quality articles' had to be seen as a deliberately misleading claim given that the journal had not yet published a single article."

Taber, 2018, p.11

Who would want their scientific work published in an outlet which has such limited respect for truth? Is this meant to persuade researchers in the field – "it must be a leading journal, even though my colleagues in the field have never heard of it, because it says so there on the website"? Or are potential authors being invited to join in the conceit, perhaps, once having published in the journal, noting in their applications for scholarships, posts, promotions and so forth, that their work was published in one of the leading international journals?

A broad scope

The scope of the journal is clearly not just 'Chemistry: Education Research and Practice' if that is read to mean that it covers educational research and practice in chemistry. Perhaps they meant something more like – chemistry: education; research; and practice?

Indeed, chemistry education does not appear in the list above, although it does feature as one of a good many 'subject categories':

Analytical chemistry – Applied Chemistry – Biochemistry – Biological Chemistry – Chemical Biology – Chemical Sciences – Chemistry Education – Cryochemistry – Electrochemistry – Environmental Chemistry – Geochemistry – Green Chemistry – Histochemistry – Immunohistochemistry – Industrial Chemistry – Inorganic Chemistry – Material Chemistry – Medicinal Chemistry – Multi-disciplinary Chemistry – Nanochemistry – Nuclear Chemistry – Organic Chemistry – Petro Chemicals – Pharmaceutical chemistry – Photochemistry – Physical Chemistry – Phytochemicals – Polymer Chemistry – Supramolecular Chemistry – Theoretical Chemistry

https://opastonline.com/journal/journal-of-chemistry-education-research-and-practice

So that's pretty much 'chemistry' – with education research as very much one theme among many.

Parasitic, predatory, journals

To my eye, then, the so-called 'Journal of Chemistry: Education Research and Practice' looks like one of those many new journals that have been set up by people who do not really know about the relevant field, and who seek to charge authors for publishing their work without any substantive concern for scientific quality or scholarly values.

That is, the business model is about attracting enough submissions to make a profit. (Which is not in itself wicked, of course, as long as profit is made by offering an honest and competent service.) That requires publishing a lot of papers. That could be seen as motivation to have a very light touch editorial and peer review policy – after all, if submitted work is rejected or authors are asked to make major revisions this will reduce, and slow, the flow of funds into the publisher.

Respected academic journals, even when published by commercial publishing houses, have high quality criteria (rejecting much work, requiring substantial revisions before publication for most that are accepted), and know their reputations depend upon the field evaluating the work that is published as being (at least generally) of high quality.

Leading journals publish significant, original articles: other respectable journals may have to settle for well-motivated, well-designed, carefully executed and thoroughly reported work that adds incrementally to a field (even if not in a seminal way).

Some of the new journals being launched to publish for a fee are not only not yet 'leading' in their fields, but are not even worthy of respect. They provide a means of publication regardless of academic quality. They accept work which authors should (and perhaps later will) be embarrassed about and they do not offer the rigorous review process that helps authors appreciate weaknesses in their work and improve it.2 They are not contributing to a field, but parasitic on it.

That is a pattern I see quite a lot these days.

A prejudiced view?

However, it is unfair to prejudge the journal without looking to see if what it is publishing is actually quality work.

I looked at the most recent issue of Journal of Chemistry: Education Research and Practice, and saw it contained five papers – only one of which seemed to have anything to do with education – Chemistry Laboratory Safety Signs Awareness Among Undergraduate Students in Rivers State.

I decided to take a look at the paper to see if I thought this article might indeed have been of 'publishable quality' in one of the journals taken seriously in the field. Of course, all editors have bad days, and it would be wrong to scrutinise one education paper among many, and use that to characterise the general standard of work in a journal. So, I also looked back at previous issues, but found only a handful of other articles that seemed to be located in the field of education:

I also noticed a couple of articles on general chemical themes which looked like they might be of wider interest (and accessible to a non-specialist like myself).

So, I decided to take a quick look at these seven articles. I was aware I approached these studies with an existing bias based on the rather 'un-scholarly' and dishonest way in which this journal went about the business of attracting submissions. But I was also aware that even if a journal does not have careful procedures and proper editorial processes, this does not mean that it might not sometimes attract excellent work. I am only going to make brief comments here on most of these articles, but I have included links to more detailed discussions of them.

An invalid research instrument

The most noteworthy thing about the study 'Chemistry Laboratory Safety Signs Awareness Among Undergraduate Students in Rivers State' was that it used a data collection instrument which was invalid. The authors seemed to want to know if students would recognise the hazards signified by different laboratory signs, but provided a test instrument which told respondents the answer to this question, as each sign was labelled with its meaning. The authors tested instead – inadvertently, it seemed – whether students knew the hazards associated with a range of laboratory reagents.

(Read about 'Laboratory safety – not on the face of it')

A surprising research hypothesis

The article 'Students' Perception of Chemistry Teachers' Characteristics of Interest, Attitude and Subject Mastery in the Teaching of Chemistry in Senior Secondary Schools' reports a study using a questionnaire to study student perceptions of their chemistry teachers. The population of students sampled was reported to be "four hundred and ten (431)" but also "six hundred and thirty" students.

The study tested a hypothesis that there would not be a gender difference in student perceptions, and, indeed, found no statistically significant difference. (I suspected that I would not be visited by a fire inspector as I read this paper, and this also proved to be correct.) But then, no rationale had been given for thinking there was any reason to consider gender might be a factor – leaving a reader wondering what had motivated the test.

(Read about 'Not motivating a research hypothesis')

Out of scope and incomplete

The study 'An overview of the first year Undergraduate Medical Students Feedback on the Point of Care Ultrasound Curriculum' was very short, and did not fit within the scope of the journal, as it was not about chemistry/chemistry education but medical education. The paper was incomplete in several senses – it did not have a full methodology section, and indeed did not seem to actually have any meaningful data analysis. It was also incomplete as it referred readers to figures which were not there: something that the author, the editor, and any peer reviewers who might have been invited to evaluate the work, seem to have all missed.

Indeed the article, which the journal bizarrely considered a review article (it was not), seemed to be the text of a conference poster which had been presented under a somewhat different authorship at different conferences. To see something so thin and insubstantial published in a supposed research journal is quite surprising.

(Read about 'The mystery of the disappearing authors')

A speculative proposal

The study 'Raman Spectroscopy: A Proposal for Didactic Innovation (IKD Model) In the Experimental Science Subject of the 3rd Year of the Primary Education Degree' does not report any empirical work, but simply offers a proposal for a teaching sequence for inclusion in undergraduate primary teacher education. It is suggested that these future primary teachers should prepare crystals from supersaturated solutions, examine the different crystal shapes from different salts, and then run Raman spectra of them.

This activity is claimed to have a wide range of benefits at the levels of the undergraduates, their future teaching, and society more widely, but no evidence is presented for any of the claims. It seems to be suggested that these students will later want to use Raman spectroscopy in their primary school teaching. This is rather ambitious, and serious research journals would be unlikely to publish such a speculative proposal without any evaluation of the idea being put into practice.

(Read about 'Spectroscopy for primary school teachers?')

Comparing two (allegedly) below average schools

The article 'Assessment of Chemistry Laboratory Equipment Availability and Practice: A Comparative Study Between Damot and Jiga Secondary Schools' uses a rather dubious questionnaire to survey chemistry teachers and students in two schools (supposedly chosen as they have different approaches to chemistry lab. work, although nothing more is offered about what these approaches are) about their perceptions of aspects of chemistry practical work. The authors conclude that both schools have very low levels of both lab equipment and laboratory practice – although this seems to be based on an entirely arbitrary guess about what should be considered an average level.

The authors seem to want their study to be considered as comparative education, seemingly on the basis that they compare chemistry practical work in two neighbouring schools. There are problems with both the data collection and analysis aspects of the study.

(Read about 'Assessing Chemistry Laboratory Equipment Availability and Practice')

A fundamental challenge to chemistry

The article 'Nature of Chemical Elements' makes claims that are potentially of great interest to chemists and chemistry teachers everywhere: that there are errors in the periodic table as chemists have got the atomic numbers wrong for many of the chemical elements; a new model of nuclear structure explains the proton:neutron ratio in different atoms; and there are new elements to be discovered to fit the gaps that had not been noticed in the periodic table.

These are pretty major claims (were they to be substantiated, probably several Nobel prizes' worth!), and any respectable research journal would engage in very careful peer review before publishing such claims. However, the journal managed to complete editorial and peer review processes in four days, apparently not spotting or being concerned about a range of conceptual issues that I felt needed correction or clarification. Like most of the articles examined, the published study contains various sloppy errors which should have been questioned or corrected by the journal's production department.

(Read 'Move over Mendeleev, here comes the new Mendel')

An author embarrasses himself

I found 'The Chemistry of Indigenous Peoples' most disappointing as it was very brief and yet incoherent in places. It made the illogical claim that the survival of the way of life of indigenous people that live in the rainforest depends upon deforestation! There seemed to be odd errors and discontinuities (that seemingly had not been spotted by the editor or any peer reviewers asked to evaluate the work). After a while, I found the cause of this: a combination of poor translations and plagiarism.

Plagiarism is presenting someone else's work as your own. This paper in 'Journal of Chemistry: Education Research and Practice', which is supposed to be by one author, is actually a patchwork of paragraphs copied from three other published works by others.

This was the most disappointing read of the sample. I felt most of the studies at least represented honest attempts to contribute to the research literature even if all seemed to suffer from limited significance (although the article which wanted to overturn a good deal of canonical physics and chemistry was at least potentially significant), most raised unexplored issues of generalisation, and most included conceptual, logical and/or methodological weaknesses as well as language/typographical errors. However, stealing other people's scholarship, and presenting it as your own work is not just poor scholarship but academic malpractice.

(Read 'Can deforestation stop indigenous groups starving?')

This incoherent montage of other people's scholarship was also submitted to another journal two days before it was submitted to the 'Journal of Chemistry: Education, Research and Practice': it is also published in an outlet called 'Acta Scientific Pharmaceutical Sciences'.

(Read about 'A failure of peer review')

Poor quality work

In summary, from the papers I looked at, that is those in the journal that I felt most qualified to evaluate, the work in Journal of Chemistry: Education Research and Practice is not of 'publishable quality'. Some of the articles might be useful starting points for a publication, and may have been suitable for improvement and development through the peer review process. However, if there was any meaningful peer review of some of these papers, it was clearly not by anyone who was both qualified to, and prepared to, carefully evaluate the manuscripts.

This lets down the community as poor quality work appears in the literature. This journal also lets down the authors as they should expect their work to be challenged and so improved, through rigorous peer review – which clearly has not operated here. The exception is the author who simply translated and pasted segments of other people's work into an incoherent composite. That is not a matter of needing editorial support, but simply of learning that it is wrong to steal. That author let themselves down.

Work cited:

Taber, K. S. (2018). The end of academic standards? A lament on the erosion of scholarly values in the post-truth world. Chemistry Education Research and Practice, 19(1), 9-14. doi:10.1039/C7RP90012K

Notes

1 Of course, there is the matter of claiming priority by publishing first. In the mythology of science this is very important – though in practice this is seldom as critical as the myth suggests. In science education someone would have to be incredibly unlucky to miss winning a major award or getting that dream job because they published a week or two after a colleague made substantially the same claim – I doubt this has ever occurred.


2 Peer review (psychologically, at least) can seem to be a bit like an irregular verb, in that my work does not really need peer review, but yours will benefit from it; the requests I get to change my submitted manuscripts are misguided, unhelpful or petty, but the recommendations I make about improving other people's work are appropriate, necessary and insightful.

Spectroscopy for primary school teachers?

Image by Schäferle from Pixabay 

Will Raman spectroscopy provide future primary teachers with "a dynamic and attractive vision of science, technology and innovation"?

Keith S. Taber

a proposal of methodology for the subject of experimental sciences for teachers in training, which will introduce real scientific instrumentation such as Raman spectroscopy, which can be of great interest to perform significant learning and to design teaching-learning activities

Morillas & Etxabe-Urbieta, 2020, p.17

I am going to offer a critical take on a proposal to teach future primary teachers to use Raman spectroscopy. That is, a proposal published in a leading international research journal (well, that is how the journal describes itself).

I do have some reservations about doing this: it is very easy to find fault in others' work (and a cynic might suggest that being an academic is basically a perpetual ongoing training in that skill). And there are features of the proposal that are appealing.

For a start, I like spectroscopy. I sometimes joke that my first degree was in spectroscopy and some of its applications (although the degree certificate refers to this as chemistry). I also like the way the paper refers to principles of models of learning, and refers to "combining concepts of chemistry and physics" (Morillas & Etxabe-Urbieta, 2020: 17).

However, I do wonder how closely (and critically) the editor and peer reviewers (assuming there really was peer review) actually read the submitted manuscript – given the range of questions one would expect to have arisen in review.

I will, below, question whether this contribution, a proposed teaching scheme, should really be considered a 'research' article. Even if one thinks it should be, I suggest the authors could have been better supported by the journal in getting their work ready for publication.

A predatory journal

I have been reading some papers in a journal that I believed, on the basis of its misleading title and website details, was an example of a poor-quality predatory journal. That is, a journal which encourages submissions simply to be able to charge a publication fee (currently $1519, according to the website), without doing the proper job of editorial scrutiny. I wanted to test this initial evaluation by looking at the quality of some of the work published.

Although the journal is called the Journal of Chemistry: Education Research and Practice (not to be confused, even if the publishers would like it to be, with the well-established journal Chemistry Education Research and Practice) only a few of the papers published are actually education studies. One of the articles that IS on an educational topic is called 'Raman Spectroscopy: A Proposal for Didactic Innovation (IKD Model) In the Experimental Science Subject of the 3rd Year of the Primary Education Degree' (Morillas & Etxabe-Urbieta, 2020).

A 'research article' in "a leading International Journal for the publication of high quality articles"

Like other work I have examined in this journal, the published article raises issues and questions which one would imagine should have arisen during peer review – that is when expert evaluators look to see if a manuscript has the importance and quality to be worthy of journal publication.

Below I very briefly outline the nature of the proposed innovation, and then offer some critique.

A proposal for didactic innovation in the primary education degree

Morillas and Etxabe-Urbieta (i) propose a sequence of practical science work for inclusion in the curriculum of undergraduate students who are preparing for primary school teaching, (ii) link this, in broad terms at least, to pedagogic principles, and (iii) make claims about the benefits of the mooted proposal.

The authors consider their proposal has originality, as they could not find other literature recommending the use of Raman spectroscopy in the preparation of primary school teachers,

"…the fact that there are no works related to Raman spectroscopy to work on concepts developed in experimental science class for Teacher training in Primary Education in formation, makes the proposal that is presented more important."

Morillas & Etxabe-Urbieta, 2020: 17

What exactly is proposed?

Morillas and Etxabe-Urbieta suggest an extended sequence of laboratory work with three main stages:

  • students are provided with three compounds (sodium nitrate; potassium nitrate; ammonium dihydrogen phosphate) from which they will prepare saturated solutions, from which crystals will grow;
  • the resulting crystals will be inspected, and examples of crystals with clear shapes will be selected and analysed in terms of their geometries – showing how different compounds lead to different crystal structures;
  • examples of ill-formed crystals will be subjected to Raman spectroscopy, where the three different compounds will give rise to different 'fingerprints'.

Pedagogic theory

Morillas and Etxabe-Urbieta report that their work is based on the 'IKD model' which equates to "new innovative teaching methodologies":

"In recent years, new innovative teaching methodologies have been used in the Basque Country University (IKD model) for experimental science classes for teachers of Primary Education in formation. This IKD model is based on a cooperative and dynamic learning. It is an own [?], cooperative, multilingual and inclusive model that emphasizes that students are the owners of their learning and are formed in a comprehensive, flexible and adapted to the needs of society. Training students according to IKD model requires creating new ways of teaching and learning more active and cooperative (curriculum development). Therefore, the fact of combining more theoretical master classes with more practical classes is a trend that is increasingly used."

Morillas & Etxabe-Urbieta, 2020: abstract

The authors name check constructivism, meaningful learning, and the notion of learning cycles, without offering much detail of what they mean by these terms.

"The students can put into practice the solubility concepts in master classes, through activities based on the IKD didactic model of the University of the Basque Country and in constructivist models of teaching and learning. Learning cycles have been developed and in a group, dynamic and cooperative way, the students explore their previous knowledge about solubility and crystallization, reflect on these ideas, make meaningful learning and apply these new learning in research contexts in the laboratory. In particular it has been discussed in the classroom about the amount of salt (compound) that can be dissolved in water and has been investigated on the factors that influence the solubility and on the crystallization process."

Morillas & Etxabe-Urbieta, 2020: 18

There is very little detail of how these pedagogic principles are built upon in the proposed teaching scheme, and the 'IKD model' is not explained in any more detail (e.g., how does 'multilingual' learning fit with this proposal?) After all, school children have been making saturated solutions and growing crystals for generations without this being understood as part of some innovative educational practice.

What is claimed?

Overall, the sequence is said to help link scientific theory to practice, teach geological concepts and provide hands-on experience of using modern scientific instruments,

"the first part, where the crystallization of various chemical compounds is carried out, will help to pinpoint possible doubts arising in the master classes of the chemistry part. Next, it is analyzed how to differentiate the crystals by means of their type of geometry in its crystallization based on geological concepts. Finally, the crystals are differentiated by another method based on the Raman spectroscopy…where students can observe concepts of light treated in physics class such as lasers, and electromagnetic lengths [sic?], where for the case in which some crystals that are not perfectly crystallized, this portable equipment will be used. In this way, the students have their first experience of this type, and use real scientific instrumentation."

Morillas & Etxabe-Urbieta, 2020: 18
Stage one: preparing crystals

But the authors suggest the approach has further impacts. Dissolving the salts, and then observing the crystals grow, "can help the student

  • to encourage possible scientific vocations,
  • better understanding of theoretical master classes and
  • letting them know how is [what it is like?] working in the scientific field and
  • spreading the importance of crystallography in our society" (p.18)

So, in addition to linking to theory classes ("students will begin to use the laboratory material studied in the theoretical classes, using and observing its characteristics, and in the same way trying to correlate the concepts of chemical saturation previously learnt in master classes", p.18), this simple practical work is expected to change student views about science careers, give authentic experience of doing science, and increase social awareness of crystallography as a scientific field. Perhaps, but this seems a lot to expect from what is a pretty standard school practical activity.

However, in case we are not convinced, the authors reinforce their claims: students will experience principles they have been taught about saturated solutions, and how solubility [often] changes with temperature, and

"the students begin to experience the first fundamental concepts of crystallography and subsequently the fact of observing week after week the growth of the crystals themselves, can help the student to encourage possible scientific vocations, better understanding of theoretical master classes and letting them know how is working in the scientific field and spreading the importance of crystallography in our society."

Morillas & Etxabe-Urbieta, 2020: 19

Some of this is perfectly reasonable, but some of these claims seem speculative. (Simply repeating an unsupported claim again on the following page does not make it more convincing.) Authentic scientific activity would surely involve extended engagement, developing and testing a procedure over time to hone a process – crystallising solutions does not become an authentic science activity simply because the evaporation takes place over several weeks.

An editor or peer reviewer might reasonably ask "how do you know this activity will have these effects?"

Stage 2: Characterising crystals
Image by Lisa Redfern from Pixabay 

In the second stage, students examine the three types of crystals formed and notice and document that they have different shapes/geometries. This requires careful observation, and measurement (of angles),

"In a second phase, once that month passed, the students will observe the crystals that have grown inside their containers. Firstly, one of the objectives will be to observe what kind of crystals have formed. For the observation methodology and subsequent for the description of them, teacher will give some guidelines to distinguish the formed crystals according to their geometry based on the geological morphology."

Morillas & Etxabe-Urbieta, 2020: 19

Growing and examining crystals seems a worthwhile topic in primary school as it can encourage awe and wonder in nature, and close observation of natural phenomena: the kinds of activities one might employ to engage young minds with the world of science (Taber, 2019). The authors expect (undergraduate) students to recognise the different crystal systems ("trigonal … orthorombic … tetragonal") and associated angles between faces. 1 This phase of the work is reasonably said to be able to

  • "promote skills such as visual and spatial perception"

It is the third stage of the work which seems to go beyond the scope of traditional work in preparing primary school teachers.

Stage 3: Using Raman spectroscopy to (i) identify compounds / (ii) appreciate particle movements

In this stage, groups of students are given samples of each of the compounds (from any of the students' specimens that did not crystallise well enough to be identified from the crystal shape), and they obtain Raman spectra from the samples, and so identify them based on being informed that the main spectral peak falls at a different wavenumber for each salt.

An inauthentic activity?

There is a sense that this rather falls down as an inquiry activity, as the students knew what the samples were, because they made the solutions and set up the crystallisations – and so presumably labelled their specimens as would be usual good scientific practice. The only reason they may now need to identify samples is because the teaching staff have deliberately mixed them up. Most school practical work is artificial in that sense, but it seems a little forced as an excuse to use spectroscopy. A flame test would surely have done the job more readily?

From 'Electrons and Wave-Particle Duality'

at http://www.sliderbase.com

A black box

Now, as the procedure is explained in the article, the spectrometer works as a black box that produces spectra which (if all has gone well) have characteristic peaks at 1067 cm-1, 1048 cm-1 or 921 cm-1, allowing the samples to be distinguished. After all, a forensics expert does not have to understand how and why we all form unique fingerprints to be able to compare fingerprints found at a crime scene with those taken from suspects. (They simply need to know that fingerprints vary between people, and have skills in making valid matches.)
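Stripped of the instrumentation, the identification step described amounts to a nearest-peak lookup against a reference table. A minimal sketch of that 'fingerprint matching' logic (the salt names are placeholders, as the article quotes only the peak positions):

```python
# Nearest-peak lookup: identify a salt by which reference wavenumber its main
# Raman peak falls closest to. Salt names are placeholders; only the peak
# positions (in cm^-1) come from the article.
REFERENCE_PEAKS = {
    "salt A": 1067,
    "salt B": 1048,
    "salt C": 921,
}

def identify(measured_peak_cm1, tolerance=10.0):
    """Return the salt whose reference peak is nearest the measured peak,
    or 'no match' if nothing lies within the tolerance."""
    best = min(REFERENCE_PEAKS,
               key=lambda s: abs(REFERENCE_PEAKS[s] - measured_peak_cm1))
    if abs(REFERENCE_PEAKS[best] - measured_peak_cm1) > tolerance:
        return "no match"
    return best

print(identify(1065))  # a peak near 1067 cm^-1 matches salt A
```

Of course, being able to run such a lookup offers no more insight into the underlying physics than matching fingerprints offers into human development.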

Yet Morillas and Etxabe-Urbieta (p.21) claim more: that undertaking this third part of the sequence will enable students to

  • "relate the type of movements that occur in the materials particles, in this case crystals, where the concept of particles movement…
  • the fact of lasers use in a realistic way helps also students to understand how these kinds of concepts exist in the reality and are not science fiction
  • …the use of this type of instrumentation in television series such as CSI, for example, means that students pay more attention in classrooms
  • and help them to grow a basic scientific curiosity in their professional work, that is, in the Primary Education classrooms"

Again, perhaps, but where is the evidence for this? If one wanted to persuade future teachers that lasers are not just science fiction, one could refer to a laser pointer, or a CD, DVD or Blu-ray player.

Final claims

The authors end their proposal with some final claims:

"The methodology proposal presented in this work, based on IKD model explained [sic, I do not think it was – at least not in any detailed way] above, will offer to Primary Education degree students a great possibility of applicability as a teaching resource, in which the fact of using Raman spectroscopy as a real scientific instrumentation can fill them with curiosity, amazement and interest. Moreover, this technique cannot only be used as a complement to this type of work [?], but also for didactic innovation projects and research projects. Thus, the fact of being able to use this type of tools means that the students are stimulated by their curiosity and desire to advance and learn, progressing in their scientific concern and therefore, improving the delivery of their future classes in a more motivated, didactic and rigorous way."

Morillas & Etxabe-Urbieta, 2020: 21

A devil's advocate might counter that an activity to identify poorly crystallised salts by subjecting them to a black box apparatus that produces messy graphs which are interrogated in terms of some mysterious catalogue of spectral lines will do very little to encourage "curiosity, amazement and interest" among any future primary school teachers who already lack confidence and enthusiasm for science. Indeed, without a good understanding of a range of underlying physical principles, the activity can offer about as much insight into science as predicting the outcome of a football match from a guide to interpreting tea leaves.

So, perhaps less like identifying fingerprints, and more like reading palms.

The references to "offer to Primary Education degree students a great possibility of applicability as a teaching resource" and "improving the delivery of their future classes in a more motivated, didactic and rigorous way" seem to mean 1 that the authors are not just suggesting that the undergraduates might benefit from this as learners, but also that they may want to introduce Raman spectroscopy into their own future teaching in primary schools.

That seems ambitious.

Spectroscopy in the school curriculum

Spectroscopy does appear in the upper levels of the secondary school curriculum, but not usually Raman spectroscopy.

Arguably, mass spectrometry 2 is most accessible as a general idea as it can be explained in terms of basic physical principles that are emphasised in school physics – mass, charge, force, acceleration… 'Mass spec.' – the chemist's elemental analyser – also offers a nice context for talking about the evidence for the existence of elements with distinct atomic numbers, and for looking at isotopic composition, as well as distinguishing elements and compounds, and testing for chemical changes (Taber, 2012).

'Mass spec.' is, however, rather different to the other main types of spectroscopy in which samples are subjected to electromagnetic radiation and the outcome of any interaction detected. 2

Image by Daniel Roberts from Pixabay 

Most spectroscopy involves firing a beam of radiation at a sample, shifting gradually through a frequency range, to see which frequencies are absorbed or re-emitted. Visible spectroscopy is perhaps the most accessible form as the principle can initially be introduced with simple hand-held spectroscopes that can be used to examine different visible light sources – rather than having to interpret chart recorder or computer screen graphics. Once students are familiar with these spectroscopes, more sophisticated spectrometers can be introduced. UV-Visible (UV-Vis) spectroscopy can be related to teaching about electronic energy levels, for example in simple models of atomic structure.

Infrared (IR) spectroscopy is based on similar principles, and can be related to the vibrations in molecules due to the presence of different bonds. Vibrational energy levels tend to be much closer together than discrete 3 electronic levels.

In these types of spectroscopy, some broad ranges of frequencies of radiation are largely unaffected by the test sample, but within these bands are narrow ranges of radiation that are absorbed to a considerable extent. These 'spectral peaks', the frequencies* of radiation being removed (or heavily attenuated) from the spectrum, reflect energy transitions due to electrons or bonds being excited to higher energy levels. (Although absorbed energy will often then be re-emitted, it will be emitted in arbitrary directions, so very little will end up aligned with the detector.)

[* Traditionally in spectroscopy the peaks are labelled not with radiation frequency but with wavenumber in cm-1 (waves per cm). This is the reciprocal of wavelength, λ, (in cm), and so directly proportional to frequency, as the speed of the radiation c = fλ.]
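For readers who want to check that proportionality, a few lines of code will convert the peak positions quoted earlier from wavenumber to frequency (the constant and the conversion are standard physics, not anything taken from the article):

```python
# Convert spectroscopic wavenumbers (cm^-1) to frequencies, using c = f * lambda,
# so that f = c * (1 / lambda) = c * wavenumber (in m^-1).
C = 2.998e8  # speed of light, m/s

def wavenumber_to_frequency_hz(wavenumber_per_cm):
    """Frequency in Hz for a wavenumber given in cm^-1."""
    return C * wavenumber_per_cm * 100  # cm^-1 -> m^-1, then multiply by c

# The three peaks quoted in the article:
for wn in (1067, 1048, 921):
    print(f"{wn} cm^-1 -> {wavenumber_to_frequency_hz(wn):.2e} Hz")
```

All three come out around 3 × 10^13 Hz, i.e. in the infrared.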

A more subtle kind of spectroscopy

Raman spectroscopy is inherently more complex, and relies on interactions between the material under test and a very small proportion of the incident radiation. Raman spectroscopy relies on a scattering effect, so as a simple analogy it is like UV/Visible or IR spectroscopy but involving something more like a Doppler shift than simple absorption. Thus the need for a monochromatic light source (the laser) as the detector is seeking shifts from the original frequency.

Figure taken from the open-access article: Xu, Yu, Zois, Cheng, Tang, Harris & Huang, 2021

So, if introducing spectroscopy one would be better advised to start with UV-Vis (or IR) where there is a strong contrast in absorption between unaffected and affected frequencies, and where there is a direct relationship between the energy of the affected radiation and the energy transitions being indirectly detected (rather than Raman spectroscopy where there is only a marginal difference between affected and unaffected frequencies, and the scattered radiation does not directly give the frequencies of the energy shifts being indirectly detected).

Learning quanta – teaching through an Aufbau principle

As learning tends to be an incremental process, building on existing knowledge, it would probably make sense to

  • introduce spectroscopy in terms of UV-Vis, first with hand held spectroscopes, then spectrometers
  • then extend this to IR which is similar in terms of basic principles and so would reinforce that learning.

Only later, once this basic understanding had been sufficiently revisited and consolidated, would it seem to make sense to

  • move onto the more complex nature of Raman spectroscopy (or nuclear magnetic resonance spectroscopy which involves similar complications).

This, at least, would seem to be a constructivist approach – which would align with Morillas and Etxabe-Urbieta's claim of employing "Teaching and Learning processes based on Constructivism theories and IKD model of the Basque Country University" (p.18).

That is, of course, if it is felt important enough to teach primary school teachers about spectroscopy.

…and as if by magic…

Actually, I am not at all convinced that

"thanks to the visualization of these spectra, students can relate the type of movements that occur in the materials particles, in this case crystals, where the concept of particles movement, which is quite abstract, can be understood"

The future teachers could certainly be taught that

"this type of technique consists in that the laser of the equipment (in our case red laser) when striking on the crystals promotes an excitation of the molecules [sic, ions?] of the own crystal, that can vibrate, rotate [sic 4] etc. This type of excitation is translated into a spectrum (different peaks) that is displayed on the screen of a computer connected to the Raman spectrometer. These peaks refer to different vibrational modes of the molecules [sic], so that each of the bands of each spectrum, corresponds to different parts of the molecule [sic], so as it has been mentioned above, each of the crystals has its own fingerprint"

Morillas & Etxabe-Urbieta, 2020: 20

Yet that seems some way short of actually relating the spectra to the "type of movements that occur in the materials particles". (In terms of my fingerprint analogy, this is like being taught that the unique fingerprint reflects the epigenetic development of the individual, and so appreciating why different people have different fingerprints, but still not being able to relate the specific fingerprints of an individual to any specific events in their development.)

Not a research paper – or even a practice paper?

I do not think this article would have been publishable in a serious research journal, as it does not seem to report any actual research. It discusses educational practice, but it is not clear whether this is practice that currently takes place or is simply being proposed. Even if this is reporting actual teaching practice, there is no evaluation of that practice.

The idea that Raman spectroscopy might be beneficial to future primary school teachers seems somewhat speculative. I have no doubt it could potentially be of some value. All other things being equal, the more science that primary school teachers know, understand, and are confident about, the better for them and their future pupils.

But of course, all other things are seldom equal. In general, teaching something new means less time for something else. Either Raman spectroscopy replaces something, or it squeezes the time available, and therefore the engagement and depth of treatment possible, in some other curriculum content.

So, rather than making great claims about how including Raman spectroscopy in the curriculum will help learn theory (will they really understand how a laser produces coherent monochromatic light, and how and why scattering takes place?), provide experience of scientific work (with an artificial exercise?), lead to scientific vocations (instead of becoming primary teachers?), and raise social awareness of crystallography, etc., what is needed is evidence that some of these educational aims and objectives are being met. And, ideally, that there is more educational gain with this activity than whatever it replaced.

I am certainly not looking to reject this proposal out of hand. I can see the sequence could engage students and be enjoyable, and may well have positive outcomes. But simply making a range of unsubstantiated claims is not research. A speculative proposal offering tenuous arguments for knowledge claims is not sufficient for a research paper.

Evaluating these claims would not be that easy (some of the effects claimed are pretty long term and indirect), but it is only when a claim is closely argued, and preferably based on empirical evidence, that it becomes science and is ready for publication in a research journal.

Peer review

Now the editor of Journal of Chemistry: Education Research and Practice may disagree with me (at least, assuming she scrutinised the article before it was published). 5 But supposedly this journal undertakes formal peer review – that is, experts in a topic are asked to evaluate submissions for suitability for publication – not only to make a recommendation on whether something should be published, but to raise any issues that need addressing before such publication.

I wonder who reviewed this submission (were they experts in primary teacher education?) and what, if any, suggestions for revisions these referees may have made. There are a good many points where one would expect a referee to ask for something to be explained or justified or corrected (e.g., molecules and rotations in salt crystals). Some of these points should be obvious to any careful reader (like asking what exactly is the IKD model that informs this proposal, and where are different features of the model enacted in the teaching sequence?) There are also places where the authors could have been supported to hone their text to make their intended meanings clearer. (I have considerable respect for authors writing in a second language, but that is not an excuse for journal editors and production staff to ignore incorrect or confusing expressions.)


Yet, based on any peer review reports, and the authors' responses to them, the editor was able to decide the manuscript was ready for publication about 10 days after initial submission.

A brave conjecture?

The proposal here is likely to seem, on the face of it, quite bizarre to many of those working in primary teacher education, who are charged with ensuring future primary teachers have a good grounding in the most basic scientific concepts, values and practices, and feel confident about teaching science to children. It therefore risks being dismissed out of hand unless very closely and carefully argued.

"…the fact that there are no works related to Raman spectroscopy to work on concepts developed in experimental science class for Teacher training in Primary Education in formation, makes the proposal that is presented more important [but also puts a high burden on the proposer to make a convincing argument for the proposal]"

Morillas & Etxabe-Urbieta, 2020: 17

So, even if the editor felt that an unproved pedagogic proposal was of itself suitable to be the basis of a research article, there is much that could have been done in editorial and peer review to support the authors in improving their manuscript to give a stronger article. After all, I suspect very few academics working in initial teacher education with future primary teachers would inherently think that Raman spectroscopy is a strong candidate for adding to the curriculum, so the case needs all the argumentation, logic and evidential support it can muster if it is to be taken seriously.

Work cited:
  • IUPAC. Compendium of Chemical Terminology, 2nd ed. (the "Gold Book"). Compiled by A. D. McNaught and A. Wilkinson. Blackwell Scientific Publications, Oxford (1997). Online version (2019-) created by S. J. Chalk. ISBN 0-9678550-9-8. https://doi.org/10.1351/goldbook.
  • Morillas, H., & Etxabe-Urbieta, J. M. (2020). Raman Spectroscopy: A Proposal for Didactic Innovation (IKD Model) In the Experimental Science Subject of the 3rd Year of the Primary Education Degree. Journal of Chemistry: Education Research and Practice, 4(1), 17-21.
  • Rajawat, J., & Jhingan, G. (2019). Chapter 1 – Mass spectroscopy. In G. Misra (Ed.), Data Processing Handbook for Complex Biological Data Sources (pp. 1-20): Academic Press.
  • Schmälzlin, E., Moralejo, B., Rutowska, M., Monreal-Ibero, A., Sandin, C., Tarcea, N., Popp, L. and Roth, M.M. (2014). Raman Imaging with a Fiber-Coupled Multichannel Spectrograph. Sensors 14, no. 11: 21968-21980. https://doi.org/10.3390/s141121968
  • Taber, K. S. (2012). Key concepts in chemistry. In K. S. Taber (Ed.), Teaching Secondary Chemistry (2nd ed., pp. 1-47). London: Hodder Education.
  • Taber, K. S. (2019). Exploring, imagining, sharing: Early development and education in science. In D. Whitebread, V. Grau, K. Kumpulainen, M. M. McClelland, N. E. Perry, & D. Pino-Pasternak (Eds.), The SAGE Handbook of Developmental Psychology and Early Childhood Education (pp. 348-364). London: Sage.
  • Xu, J., Yu, T., Zois, C. E., Cheng, J.-X., Tang, Y., Harris, A. L., & Huang, W. E. (2021). Unveiling Cancer Metabolism through Spontaneous and Coherent Raman Spectroscopy and Stable Isotope Probing. Cancers, 13(7), 1718.

Notes

1 Throughout the paper I would have appreciated an indication of which aspects of the activity were intended purely for the education of the future teachers themselves and which aspects were meant to be modelled for future use in primary classrooms.


2 Is spectroscopy the same as spectrometry? Strictly these terms have different meanings. According to the International Union of Pure and Applied Chemistry (IUPAC, 2019-):

  • spectroscopy is "the study of physical systems by the electromagnetic radiation with which they interact or that they produce"

whereas

  • "spectrometry is the measurement of such radiations as a means of obtaining information about the systems and their components."

And

  • mass spectroscopy is "the study of systems by causing the formation of gaseous ions, with or without fragmentation, which are then characterized by their mass-to-charge ratios and relative abundances."
  • mass spectrometry is "the branch of science dealing with all aspects of mass spectroscopes and the results obtained with these instruments"
  • a mass spectrograph is "an instrument in which beams of ions are separated (analysed) according to the quotient mass/charge, and in which the deflection and intensity of the beams are recorded directly on photographic plate or film"

So that has cleared that up!

In practice the terms spectroscopy and spectrometry are often used synonymously, even in relation to mass spectrometry (e.g., Rajawat & Jhingan, 2019) which strictly does not involve the interaction of matter with radiation.


3 Discrete, as this would not apply to the near continuum bands of energy levels found in metals for example.


4 Although I am not convinced that rotational modes of excitation can be detected in a solid crystal.


5 The editor of a research journal is the person who makes publication decisions. However, predatory journals do not always operate like serious research journals – and it may be that sometimes these decisions are made by admin. staff and the editor's name is just used as a sop to respectability. I do not know if that is the case with this journal, but I think by any normal academic standards some very dubious editorial decisions are being made by someone!


Assessing Chemistry Laboratory Equipment Availability and Practice

Comparative education on a local scale?

Keith S. Taber

Image by Mostafa Elturkey from Pixabay 

I have just read a paper in a research journal which compares the level of chemistry laboratory equipment and 'practice' in two schools in the "west Gojjam Administrative zone" (which according to a quick web-search is in the Amhara Region in Ethiopia). According to Yesgat and Yibeltal (2021),

"From the analysis of Chemistry laboratory equipment availability and laboratory practice in both … secondary school and … secondary school were found in very low level and much far less than the average availability of chemistry laboratory equipment and status of laboratory practice. From the data analysis average chemistry laboratory equipment availability and status of laboratory practice of … secondary school is better than that of Jiga secondary school."

Yesgat and Yibeltal, 2021: abstract [I was tempted to omit the school names in this posting as I was not convinced the schools had been treated reasonably, but the schools are named in the very title of the article]

Now that would seem to be something that could clearly be of interest to teachers, pupils, parents and education administrators in those two particular schools, but it raises the question that can be posed in relation to any research: 'so what?' The findings might be a useful outcome of enquiry in their own context, but what generalisable knowledge do they offer that justifies a place in the research literature? Why should anyone outside of West Gojjam care?

The authors tell us,

"There are two secondary schools (Damot and Jiga) with having different approach of teaching chemistry in practical approach"

Yesgat and Yibeltal, 2021: 96

So, this suggests a possible motivation.

  • If these two approaches reflect approaches that are common in schools more widely, and
  • if these two schools can be considered representative of schools that adopt these two approaches, and
  • if 'Chemistry Laboratory Equipment Availability and Practice' can be considered to be related to (a factor influencing? an effect of?) these different approaches, and
  • if the study validly and reliably measures 'Chemistry Laboratory Equipment Availability and Practice', and
  • if substantive differences are found between the schools

then the findings might well be of wider interest. As always in research, the importance we give to findings depends upon a whole logical chain of connections that collectively make an argument.

Spoiler alert!

At the end of the paper, I was none the wiser what these 'different approaches' actually were.

A predatory journal

I have been reading some papers in a journal that I believed, on the basis of its misleading title and website details, was an example of a poor-quality 'predatory journal'. That is, a journal which encourages submissions simply to be able to charge a publication fee (currently $1519, according to the website), without doing the proper job of editorial scrutiny. I wanted to test this initial evaluation by looking at the quality of some of the work published.

Although the journal is called the Journal of Chemistry: Education Research and Practice (not to be confused, even if the publishers would like it to be, with the well-established journal Chemistry Education Research and Practice) only a few of the papers published are actually education studies. One of the articles that IS on an educational topic is called 'Assessment of Chemistry Laboratory Equipment Availability and Practice: A Comparative Study Between Damot and Jiga Secondary Schools' (Yesgat & Yibeltal, 2021).

Comparative education?

Yesgat and Yibeltal imply that their study falls in the field of comparative education. 1 They inform readers that 2,

"One purpose of comparative education is to stimulate critical reflection about our educational system, its success and failures, strengths and weaknesses. This critical reflection facilitates self-evaluation of our work and is the basis for determining appropriate courses of action. Another purpose of comparative education is to expose us to educational innovations and systems that have positive outcomes. Most compartivest states [sic] that comparative education has four main purposes. These are:

To describe educational systems, processes or outcomes

To assist in development of educational institutions and practices

To highlight the relationship between education and society

To establish generalized statements about education that are valid in more than one country"

Yesgat & Yibeltal, 2021: 95-96
Comparative education studies look to characterise (national) education systems in relation to their social/cultural contexts (Image by Gerd Altmann from Pixabay)

Of course, like any social construct, 'comparative education' is open to interpretation and debate: for example, "that comparative education brings together data about two or more national systems of education, and comparing and contrasting those data" has been characterised as an "a naive and obvious answer to the question of what constitutes comparative education" (Turner, 2019, p.100).

There is then some room for discussion over whether particular research outputs should count as 'comparative education' studies or not. Many comparative education studies do not actually compare two educational systems, but rather report in detail from a single system (making possible subsequent comparisons based across several such studies). These educational systems are usually understood as national systems, although there may be a good case to explore regional differences within a nation if regions have autonomous education systems and these can be understood in terms of broader regional differences.

Yet, studying one aspect of education within one curriculum subject at two schools in one educational administrative area of one region of one country cannot be understood as comparative education without doing excessive violence to the notion. This work does not characterise an educational system at national, regional or even local level.

My best assumption is that as the study is comparing something (in this case an aspect of chemistry education in two different schools) the authors feel that makes it 'comparative education', by which account of course any educational experiment (comparing some innovation with some kind of comparison condition) would automatically be a comparative education study. We all make errors sometimes, assuming terms have broader or different meanings than their actual conventional usage – and may indeed continue to misuse a term till someone points this out to us.

This article was published in what claims to be a peer reviewed research journal, so the paper was supposedly evaluated by expert reviewers who would have provided the editor with a report on strengths and weaknesses of the manuscript, and highlighted areas that would need to be addressed before possible publication. Such a reviewer would surely have reported that 'this work is not comparative education, so the paragraph on comparative education should either be removed, or authors should contextualise it to explain why it is relevant to their study'.

The weak links in the chain

A research report makes certain claims that derive from a chain of argument. To be convinced about the conclusions you have to be convinced about all the links in the chain, such as:

  • sampling (were the right people asked?)
  • methodology (is the right type of research design used to answer the research question?)
  • instrumentation (is the data collection instrument valid and reliable?)
  • analysis (have appropriate analytical techniques been carried out?)

These considerations cannot be averaged: if, for example, a data collection instrument does not measure what it is said to measure, then it does not matter how good the sample, or how careful the analysis, the study is undermined and no convincing logical claims can be built. No matter how skilled I am in using a tape measure, I will not be able to obtain accurate weights with it.

Sampling

The authors report the make-up of their sample – all the chemistry teachers in each school (13 in one, 11 in the other), plus ten students from each of grades 9, 10 and 11 in each school. They report that "… 30 natural science students from Damot secondary school have been selected randomly. With the same technique … 30 natural sciences students from Jiga secondary school were selected".

Random selection is useful to know there is no bias in a sample, but it is helpful if the technique for randomisation is briefly reported to assure readers that 'random' is not being used as a synonym for 'arbitrary' and that the technique applied was adequate (Taber, 2013b).

A random selection across a pooled sample is unlikely to lead to equal representation in each subgroup (From Taber, 2013a)

Actually, if 30 students had been chosen at random from the population of students taking natural sciences in one of the schools, it would be extremely unlikely they would be evenly spread, 10 from each year group. Presumably, the authors made random selections within these grade levels (which would be eminently sensible, but is not quite what they report).
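The point about pooled versus stratified selection is easy to demonstrate with a short simulation (the cohort sizes here are invented for illustration):

```python
import random
from collections import Counter

# Simulate drawing 30 students at random from a pooled population of three
# grade cohorts, and count how often the draw happens to give the exact
# 10/10/10 split across grades that the authors report.
random.seed(1)
population = ["grade 9"] * 120 + ["grade 10"] * 110 + ["grade 11"] * 100

trials, even_splits = 2000, 0
for _ in range(trials):
    counts = Counter(random.sample(population, 30))
    if counts["grade 9"] == counts["grade 10"] == counts["grade 11"] == 10:
        even_splits += 1
print(f"{even_splits}/{trials} pooled draws gave an exact 10/10/10 split")

# Sampling within each grade (stratified selection) guarantees the design:
stratified = [s for grade in ("grade 9", "grade 10", "grade 11")
              for s in random.sample([p for p in population if p == grade], 10)]
print(Counter(stratified))  # exactly 10 per grade, every time
```

The pooled draw only rarely lands on an exactly even split, which is why the reported 10-per-grade sample almost certainly reflects random selection within grade levels rather than from the pooled population.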

Read about the criterion for randomness in research

Data collection

To collect data the authors constructed a questionnaire with Likert-type items.

"…questionnaire was used as data collecting instruments. Closed ended questionnaires with 23 items from which 8 items for availability of laboratory equipment and 15 items for laboratory practice were set in the form of "Likert" rating scale with four options (4=strongly agree, 3=agree, 2=disagree and 1=strongly disagree)"

Yesgat & Yibeltal, 2021: 96

These categories were further broken down (Yesgat & Yibeltal, 2021: 96): "8 items of availability of equipment were again sub grouped in to

  • physical facility (4 items),
  • chemical availability (2 items), and
  • laboratory apparatus (2 items)

whereas 15 items of laboratory practice were further categorized as

  • before actual laboratory (4 items),
  • during actual laboratory practice (6 items) and
  • after actual laboratory (5 items)"

Internal coherence

So, there were two basic constructs, each broken down into three sub-constructs. This instrument was piloted,

"And to assure the reliability of the questionnaire a pilot study on a [sic] non-sampled teachers and students were conducted and Cronbach's Alpha was applied to measure the coefficient of internal consistency. A reliability coefficient of 0.71 was obtained and considered high enough for the instruments to be used for this research"

Yesgat & Yibeltal, 2021: 96

Running a pilot study can be very useful, as it can highlight problems with items. However, simply asking people to complete a questionnaire may only reveal items they could make no sense of at all; it is less informative than interviewing respondents about how they understood the items, to check that they interpret them in the same way as the researchers.

The authors cite the value of Cronbach's alpha to demonstrate that their instrument has internal consistency. However, they appear to be quoting the value obtained in the pilot study, whereas the statistic strictly applies to a particular administration of an instrument (so a value from the main study would be more relevant to the results reported).

More problematically, the authors appear to cite a value of alpha computed across all 23 items (n.b., the value of alpha tends to increase as the number of items increases, so what counts as an acceptable value needs to allow for the number of items included) when these actually form two distinct scales: 'availability of laboratory equipment' and 'laboratory practice'. Alpha should be quoted separately for each scale – values computed across distinct scales are not useful (Taber, 2018). 3
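For readers unfamiliar with the statistic, here is a minimal sketch of how alpha is computed for a single scale (the ratings are invented, not the authors' data); the same function would be applied separately to the 'availability' items and to the 'practice' items:

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for one scale.

    scores: one row per respondent, each row a list of item ratings.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k = len(scores[0])                      # number of items in the scale
    items = list(zip(*scores))              # regroup ratings by item
    item_vars = sum(pvariance(item) for item in items)
    total_var = pvariance([sum(row) for row in scores])
    return k / (k - 1) * (1 - item_vars / total_var)

# Invented ratings: 5 respondents x 3 items on one scale, coded 1-4
ratings = [[3, 3, 4], [2, 2, 2], [4, 4, 4], [1, 2, 1], [3, 4, 3]]
print(round(cronbach_alpha(ratings), 2))   # items that vary together give a high alpha
```

Because adding items tends to push alpha up, a single figure pooled across 23 items from two conceptually distinct scales tells a reader little about the internal consistency of either scale.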

Do the items have face validity?

The items in the questionnaire are reported in appendices (pp.102-103), so I have tabulated them here so that readers can consider

  • (a) whether they feel these items reflect the constructs of 'availability of equipment' and 'laboratory practice';
  • (b) whether the items are phrased in a clear way for both teachers and students (the authors report "conceptually the same questionnaires with different forms were prepared" (p.101), but if this means the wording differed for teachers and students, this is not elaborated – teachers were also asked demographic questions about their educational level); and
  • (c) whether they are all reasonable things to expect both teachers and students to be able to rate.
| 'Availability of equipment' items | 'Laboratory practice' items |
| --- | --- |
| Structured and well-equipped laboratory room | You test the experiments before your work with students |
| Availability of electric system in laboratory room | You give laboratory manuals to student before practical work |
| Availability of water system in laboratory room | You group and arrange students before they are coming to laboratory room |
| Availability of laboratory chemicals are available [sic] | You set up apparatus and arrange chemicals for activities |
| No interruption due to lack of lab equipment | You follow and supervise students when they perform activities |
| Isolated bench to each student during laboratory activities | You work with the lab technician during performing activity |
| Chemicals are arranged in a logical order. | You are interested to perform activities? |
| Laboratory apparatus are arranged in a logical order | You check appropriate accomplishment of your students' work |
| | Check your students' interpretation, conclusion and recommendations |
| | Give feedbacks to all your students work |
| | Check whether the lab report is individual work or group |
| | There is a time table to teachers to conduct laboratory activities. |
| | Wear safety goggles, eye goggles, and other safety equipment in doing so |
| | Work again if your experiment is failed |
| | Active participant during laboratory activity |

Items teachers and students were asked to rate on a four-point scale (strongly agree / agree / disagree / strongly disagree)

Perceptions

One obvious limitation of this study is that it relies on reported perceptions.

One way to find out about the availability of laboratory equipment might be to visit teaching laboratories and survey them with an observation schedule – and perhaps even make a photographic record. The questionnaire assumes that teacher and student perceptions are accurate and that honest reports would be given (might teachers have had an interest in offering a particular impression of their work?)

Sometimes researchers are actually interested in impressions (e.g., for some purposes whether a student considers themselves a good chemistry student may be more relevant than an objective assessment), and sometimes researchers have no direct access to a focus of interest and must rely on other people's reports. Here, though, it might be suggested that a survey by questionnaire is not really the best way to, for example, "evaluate laboratory equipment facilities for carrying out practical activities" (p.96).

Findings

The authors describe their main findings as,

"Chemistry laboratory equipment availability in both Damot secondary school and Jiga secondary school were found in very low level and much far less than the average availability of chemistry laboratory equipment. This finding supported by the analysis of one sample t-values and as it indicated the average availability of laboratory equipment are very much less than the test value and the p-value which is less than 0.05 indicating the presence of significant difference between the actual availability of equipment to the expected test value (2.5).

Chemistry laboratory practice in both Damot secondary school and Jiga secondary school were found in very low level and much far less than the average chemistry laboratory practice. This finding supported by the analysis of one sample t-values and as it indicated the average chemistry laboratory practice are very much less than the test value and the p-value which is less than 0.05 indicating the presence of significant difference between the actual chemistry laboratory practice to the expected test value."

Yesgat & Yibeltal, 2021: 101 (emphasis added)

This is the basis for the claim in the abstract that "From the analysis of Chemistry laboratory equipment availability and laboratory practice in both Damot secondary school and Jiga secondary school were found in very low level and much far less than the average availability of chemistry laboratory equipment and status of laboratory practice."

'The average …': what is the standard?

But this raises a key question – how do the authors know what the "the average availability of chemistry laboratory equipment and status of laboratory practice" is, if they have only used their questionnaire in two schools (which are both found to be below average)?

Yesgat & Yibeltal have run a comparison between the average ratings they obtained from the two schools on their two scales and an 'average test value' of 2.5. As far as I can see, this is not an empirical value at all. It seems the authors have just assumed that if people are asked to use a four-point scale – 1, 2, 3, 4 – then the average rating will be… 2.5. Of course, that is a completely arbitrary assumption. (Consider the question – 'how much would you like to be beaten and robbed today?': would the average response be likely to be the nominal mid-point of the rating scale?) Perhaps if a much wider survey had been undertaken the actual average rating would have been 1.9 or 2.7 or …

That is even assuming that 'average' is a meaningful concept here. A four-point Likert scale is an ordinal scale ('agree' always indicates less agreement than 'strongly agree', and more than 'disagree') but not an interval scale (that is, it cannot be assumed that the perceived 'agreement' gap (i) from 'strongly disagree' to 'disagree' is the same for each respondent, and the same as that (ii) from 'disagree' to 'agree' and (iii) from 'agree' to 'strongly agree'). Strictly, then, Likert scale ratings cannot be averaged (they are better presented as bar charts showing frequencies of response) – so although the authors carry out a great deal of analysis, much of it is, strictly, invalid.
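The frequency-based alternative is easy to sketch (the responses below are invented): rather than computing a mean of the 1–4 codes, report how often each category was chosen.

```python
from collections import Counter

# Invented responses to one Likert item (4=strongly agree ... 1=strongly disagree)
responses = [4, 3, 3, 2, 4, 1, 3, 2, 2, 3]

labels = {4: "strongly agree", 3: "agree", 2: "disagree", 1: "strongly disagree"}
counts = Counter(responses)
for code in (4, 3, 2, 1):
    print(f"{labels[code]:>17}: {'#' * counts[code]} ({counts[code]})")

# A mean of these codes (here 2.7) would hide the distribution entirely - and
# would assume the gaps between adjacent categories are equal, which an
# ordinal scale cannot guarantee.
```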

So what has been found out from this study?

I would very much like to know what peer reviewers made of this study. Expert reviewers would surely have identified some very serious weaknesses in the study and would have been expected to have recommended some quite major revisions even if they thought it might eventually be publishable in a research journal.

An editor is expected to take on board referee evaluations and ask authors to make such revisions as are needed to persuade the editor the submission is ready for publication. It is the job of the editor of a research journal, supported by the peer reviewers, to

a) ensure work of insufficient quality is not published

b) help authors strengthen their paper to correct errors and address weaknesses

Sometimes this process takes some time, with a number of cycles of revision and review. Here, however, the editor was able to move to a decision to publish in 5 days.

The study reflects a substantive amount of work by the authors. Yet it is hard to see how the study, at least as reported in this journal, makes a substantive contribution to public knowledge. It finds that one school obtains somewhat higher ratings than another on an instrument that has not been fully validated, based on a pooling of student and teacher perceptions, and it guesses that both schools rate lower than a hypothetical 'average' school. The two schools were supposed to represent "different approach[es] of teaching chemistry in practical approach" – but even if that is the case, the authors have not shared with their readers what these different approaches are meant to be. So there would be no possibility of generalising from the schools to 'approach[es] of teaching chemistry', even if that were logically justifiable. And comparative education it is not.

This study, at least as published, does not seem to offer useful new knowledge to the chemistry education community that could support teaching practice or further research. Even in the very specific context of the two specific schools it is not clear what can be done with the findings which simply reflect back to the informants what they have told the researchers, without exploring the reasons behind the ratings (how do different teachers and students understand what counts as 'Chemicals are arranged in a logical order') or the values the participants are bringing to the study (is 'Check whether the lab report is individual work or group' meant to imply that it is seen as important to ensure that students work cooperatively or to ensure they work independently or …?)

If there is a problem highlighted here by the "very low levels" (based on a completely arbitrary interpretation of the scales) there is no indication of whether this is due to resourcing of the schools, teacher preparation, levels of technician support, teacher attitudes or pedagogic commitments, timetabling problems, …

This seems to be a study which has highlighted two schools, invited teachers and students to complete a dubious questionnaire, and simply used this to arbitrarily characterise the practical chemistry education in the schools as very poor, without contextualising any challenges or offering any advice on how to address the issues.

Work cited:
Notes:

1 'Imply' as Yesgat and Yibeltal do not actually state that they have carried out comparative education. However, if they do not think so, then the paragraph on comparative education in their introduction has no clear relationship with the rest of the study and is not more than a gratuitous reference, like suddenly mentioning Nottingham Forest's European Cup triumphs or noting a preferred flavour of tea.


2 This seemed an intriguing segment of the text, as it was largely written in a more sophisticated form of English than the rest of the paper – apart from the odd reference to "Most compartivest [comparative education specialists?] states…", which stood out from the rest of the segment. Yesgat and Yibeltal do not present this as a quote, but cite a source informing their text (their reference [4]: Joubish, 2009). However, their text is very similar to that in another publication:

| Quote from Mbozi, 2017, p.21 | Quote from Yesgat and Yibeltal, 2021, pp.95-96 |
| --- | --- |
| "One purpose of comparative education is to stimulate critical reflection about our educational system, its success and failures, strengths and weaknesses. | One purpose of comparative education is to stimulate critical reflection about our educational system, its success and failures, strengths and weaknesses. |
| This critical reflection facilitates self-evaluation of our work and is the basis for determining appropriate courses of action. | This critical reflection facilitates self-evaluation of our work and is the basis for determining appropriate courses of action. |
| Another purpose of comparative education is to expose us to educational innovations and systems that have positive outcomes. | Another purpose of comparative education is to expose us to educational innovations and systems that have positive outcomes. |
| The exposure facilitates our adoption of best practices. | |
| Some purposes of comparative education were not covered in your exercise above. | |
| Purposes of comparative education suggested by two authors Noah (1985) and Kidd (1975) are presented below to broaden your understanding of the purposes of comparative education. | |
| Noah, (1985) states that comparative education has four main purposes [4] and these are: | Most compartivest states that comparative education has four main purposes. These are: |
| 1. To describe educational systems, processes or outcomes | • To describe educational systems, processes or outcomes |
| 2. To assist in development of educational institutions and practices | • To assist in development of educational institutions and practices |
| 3. To highlight the relationship between education and society | • To highlight the relationship between education and society |
| 4. To establish generalized statements about education, that are valid in more than one country." | • To establish generalized statements about education that are valid in more than one country" |

Comparing text (broken into sentences to aid comparison) from two sources

3 There are more sophisticated techniques which can be used to check whether items do 'cluster' as expected for a particular sample of respondents.


4 As suggested above, researchers can pilot instruments with interviews or 'think aloud' protocols to check if items are understood as intended. Asking assumed experts to read through and check 'face validity' is of itself quite a limited process, but can be a useful initial screen to identify items of dubious relevance.

Knocking off a quick pharmaceutical intervention

(Or, if not, knocking a pharmaceutical intervention journal)

Keith S. Taber

an International Peer-Reviewed, Multi-disciplinary Scientific Journal (https://www.scriptionpublications.org/journal-details/10/Journal-of-Pharmaceutical-Interventions#)

Yesterday,

Yesterday I was setting up a discrete webpage for characterising predatory journals as my page on 'Journals and poor academic practice' was looking a bit text heavy. I was listing a number of the features that I saw in invitations to publish in journals that seemed to fit the descriptor 'predatory' (after my money, and not really interested in the quality of scholarship they publish).

…today…

As if by magic…

When I turned on the computer this morning I found an email from the Journal of Pharmaceutical Interventions asking me to contribute to the journal. It was almost like they were looking to offer an illustration of several of the features I was highlighting:

  • being in a rush to get submissions (perhaps because they do not seem to have published a single article yet)
  • accepting a wide range of different 'article' types
  • praising my eminence in a field I have never worked in
  • name-checking and claiming to have read something (of little relevance to the inviting journal!) I've published
  • a relatively broad range of topics 1

This does not prove that the journal will not have high editorial standards, but it is not looking promising. 2

…and tomorrow?

I guess I will be pretty busy if I am going to learn enough about the field of Pharmaceutical Interventions to produce something of publishable quality within a week.

Notes

1 The list in the email does not look overly inclusive, but the website (https://www.scriptionpublications.org/journal-details/10/Journal-of-Pharmaceutical-Interventions – accessed 2021-11-18) offers the following list of topics as being within the journal's scope – including some that certainly do not look like pharmaceutical science to me!

  • Analytical Chemistry
  • Bioanalytical Chemistry
  • Bio-Chemical Science
  • Biomedical Engineering
  • Bio-medical Sciences
  • Biopharmaceutics
  • Biopharmaceutics and Pharmacokinetics
  • Clinical and Hospital Pharmacy
  • Computational Chemistry
  • Cosmetics and Neutraceuticals
  • Dental and Medical Sciences
  • Drug Design
  • Drug Development
  • Drug Discovery
  • Drug Regulatory Affairs
  • Drug Targeting
  • Drug-Receptor Interactions
  • Environmental Chemistry
  • Environmental Sciences
  • Fermentation Technology
  • Fisheries and Dairy Science
  • Food and Nutrition Science
  • Genetics and Proteomics
  • Genomics
  • Green Chemistry
  • Health Sciences
  • Herbal Technology
  • Industrial Pharmacy
  • Intellectual property rights in Chemical Sciences
  • Life Sciences
  • Marine Biology
  • Medical Pharma
  • Medication Management
  • Medicinal Chemistry
  • Medicine and Neurobiology
  • Microbiology and Nuclear Pharmacy
  • Molecular Drug Design
  • Nanomedicine
  • Nanotechnology/ nanomedicine
  • Natural Chemistry
  • Natural Product Research
  • Novel Drug Delivery Systems
  • Oncology
  • Patent Laws
  • Pharma Administration
  • Pharma Engineering
  • Pharmaceutical Analysis
  • Pharmaceutical Analysis
  • Pharmaceutical Biotechnology and Microbiology
  • Pharmaceutical Care
  • Pharmaceutical Chemistry
  • Pharmaceutical Formulation
  • Pharmaceutical Public Health
  • Pharmaceutical Sciences
  • Pharmaceutics
  • Pharmacodynamics
  • Pharmacoeconomics
  • Pharmacoepidemiology
  • Pharmacogenetics and Pharmacogenomics
  • Pharmacogenomics
  • Pharmacogenomics and Physiology
  • Pharmacognosy and Ethanobotany
  • Pharmacology and Toxicology
  • Pharmacotherapy
  • Pharmacovigilance
  • Pharmacovigilance
  • Pharmacy Practice and Hospital Pharmacy
  • Physiological and Biochemical Effects of Drugs on the Body
  • Phytochemistry
  • Phytochemistry and QC / QA
  • Phytomedicine
  • Plant pathology and Entomology
  • Polymer Sciences
  • Quality Assurance
  • Regulatory Affairs
  • Soil and Seed Science
  • Synthetic Chemistry

2 The journal claims:

"Every article submitted to our platform is peer-reviewed by a distinguished editorial board and expert reviewers at the same moment, peer reviewers follow rigorous publication ethics thus confirming the article standards of significance and scientific excellence and deliver a quality systematic service to the Authors, Reviewers and Readers throughout the publication process….Every article submitted to the journal is rigorously examined and published only after the acceptance of Editorial Board members."

I would be happy to learn this is so, and that rigorous editorial processes are simply not well reflected by the sloppy direct marketing approach to encouraging submissions. I guess only time will tell.

The mystery of the disappearing authors

Original image by batian lu from Pixabay 

Can an article be simultaneously out of scope, and limited in scope?

Keith S. Taber

Not only had two paragraphs from the abstract gone missing, along with the figures, but the journal article had also lost two-thirds of its authors.

I have been reading some papers in a journal that I believed, on the basis of its misleading title and website details, was an example of a poor-quality 'predatory journal'. That is, a journal which encourages submissions simply to be able to charge a publication fee (currently $1519, according to the website), without doing the proper job of editorial scrutiny. I wanted to test this initial evaluation by looking at the quality of some of the work published.

Although the journal is called the Journal of Chemistry: Education Research and Practice (not to be confused, even if the publishers would like it to be, with the well-established journal Chemistry Education Research and Practice) only a few of the papers published are actually education studies.

One of the articles that IS on an educational topic is called 'An overview of the first year Undergraduate Medical Students [sic] Feedback on the Point of Care Ultrasound Curriculum' (Mohialdin, 2018a), by Vian Mohialdin, an Associate Professor of Pathology and Molecular Medicine at McMaster University in Ontario.

A single-authored paper by Prof. Mohialdin

Review articles

Research journals tend to distinguish between different types of articles, and most commonly:

  • papers that report empirical studies,
  • articles which set out theoretical perspectives/positions, and
  • articles that offer reviews of the existing literature on a topic.

'An overview of the first year Undergraduate Medical Students Feedback on the Point of Care Ultrasound Curriculum' is classified as a review article.

A review article?

Typically, review articles cite a good deal of previous literature. Prof. Mohialdin cites a modest number of previous publications – just 10. Now one might suspect that perhaps the topic of point-of-care ultrasound in undergraduate medical education is a fairly specialist topic, and perhaps even a novel topic, in which case there may not be much literature to review. But a review of ultrasound in undergraduate medical education published a year earlier (Feilchenfeld, Dornan, Whitehead & Kuper, 2017) cited over a hundred works.

Actually, a quick inspection of Mohialdin's paper reveals that it is not a review article at all, as it reports a single empirical study. Either the journal has misclassified the article, or the author submitted it as a review article and the journal did not query this. To be fair, the journal website does note that classification into article types "is subjective to some degree". 1

So, is it a good study?

Not a full paper

Well, that is not easy to evaluate, as the article is less than two pages in length, whereas most research studies in education are much more substantial. Even the abstract of the article seems lacking (see the table below, left hand column). The abstract of a research paper is usually expected to report, very briefly, something about the research sample/population (who participated in the study?); the research design/methodology (is it an experiment, a survey, …?); and the results (what did the researchers find out?). The abstract of Prof. Mohialdin's paper misses all these points, and so tells readers nothing about the research.

The main text also lacks some key information. The study is a type of research report that is sometimes called a 'practice paper' – the article reports some teaching innovation carried out by practitioners in their own teaching context. The text does give some details of what the practice was – but simply writing about practice is not usually considered sufficient for a research paper. At the least, there needs to be some evaluation of the innovation.

The research design for the evaluation is limited to two sentences under the section heading 'Conclusion/Result Result'. (Mohialdin, 2018a, p.1)

Here there has been some evaluation, but the report is very sketchy, and so might seem inadequate for a research report. Under a rather odd section heading, the reader is informed,

"A questionnaire was handed to the first year undergraduate medical students at the end of session four, to evaluate their hands on ultrasound session experience."

Mohialdin, 2018a, p.1

That one sentence comprises the account of data collection.

The questionnaire is not reproduced for readers. Nor is it described (how many questions, and of what kinds?). Nor is its development reported. There is no indication of how many of the 150 students in the population completed the questionnaire; whether ethical procedures were followed 2; where the students completed it (for example, was it administered in a class setting where participants were being observed by the teaching staff, or did they take it away with them "at the end of session four" to complete in private?); or whether they were able to respond anonymously (rather than have their teachers able to identify who made which responses).

Perhaps there are perfectly appropriate responses to these questions – but as the journal peer reviewers and editor do not seem to have asked, the reader is left in the dark.

Invisible analytical techniques

Similarly, details of the analysis undertaken are, again, sketchy. A reader is told:

"Answers were collected and data was [sic] analyzed into multiple graphs (as illustrated on this poster)."

Mohialdin, 2018a, p.1

Now that sounds promising, except either the author forgot to submit the graphs with the text, or the journal somehow managed to lose them in production. 3 (And as I've found out, even the most prestigious and well established publishers can lose work they have accepted for publication!)

So, readers are left with no idea what questions were asked, nor what responses were offered, that led to the graphs – that are not provided.

There were also comments – presumably [sic – it would be good to be told] in response to open-ended items on the questionnaire.

"The comments that we [sic, not I] got from this survey were mainly positive; here are a few of the constructive comments that we [sic] received:…

We [sic] also received some comments about recommendations and ways to improve the sessions (listed below):…"

Mohialdin, 2018a, 1-2.

A reader might ask who decided which comments should be counted as positive (e.g., was it a rater independent of the team who implemented the innovation?), and what does 'mainly' mean here (e.g., 90 of 100 responses? 6 of 11?).

So, in summary, there is no indication of what was asked, who exactly responded, or how the analysis was carried out. As the Journal of Chemistry: Education Research and Practice claims to be a peer reviewed journal one might expect reviewers to have recommended at least that such information (along with the missing graphs) should be included before publication might be considered.

There is also another matter that one would expect peer reviewers, and especially the editor, to have noticed.

Not in scope

Research journals usually have a scope – a range of topics they publish articles on. This is normally made clear in the information on journal websites. Despite its name, the Journal of Chemistry: Education Research and Practice does not restrict itself to chemistry education, but invites work on all aspects of the chemical sciences, and indeed most of its articles are not educational.

Outside the scope of the journal? (Original Image by Magnascan from Pixabay )

But 'An overview of the first year Undergraduate Medical Students Feedback on the Point of Care Ultrasound Curriculum' is not about chemistry education or chemistry in a wider sense. Ultrasound diagnostic technology falls under medical physics, not a branch of chemistry. And, more pointedly, teaching medical students to use ultrasound to diagnose medical conditions falls under medical education – as the reference to 'Medical Students' in the article title rather gives away. So, it is odd that this article was published where it was, as it should have been rejected from this particular journal as being out of scope.

Despite the claims of the Journal of Chemistry: Education Research and Practice to be a peer-reviewed journal (meaning that all submissions are supposedly sent out to, and scrutinised and critiqued by, qualified experts on the topic, who make recommendations about whether the work is of sufficient quality for publication and, if so, whether changes should be made first – like, perhaps, including graphs that are referred to but missing), the editor managed to reach a decision to publish just seven days after the submission was received.

The chemistry journal accepted the incomplete report of the medical education study, to be described as a review article, one week after submission.

The journal article as a truncated conference poster?

The reference to "multiple graphs (as illustrated on this poster)" (my emphasis) suggested that the article was actually the text (if not the figures) of a poster presented at a conference, and a quick search revealed that Mohialdin, Wainman and Shali had presented on 'An overview of the first year Undergraduate Medical Students Feedback on the Point of Care Ultrasound Curriculum' at an experimental biology (sic, not chemistry) conference.

A poster at a conference is not considered a formal publication, so there is nothing inherently wrong with publishing the same material in a journal – although posters often report quite provisional or relatively inconsequential work, so it is unusual for the text of a poster to be considered sufficiently rigorous and novel to justify appearing in a research journal in its original form. It is notable that, despite being described by Prof. Mohialdin as a 'preliminary' study, it was judged by the journal to be of publishable quality.

Although norms vary between fields, it is generally the case that a conference poster is seen as something quite different from a journal article. There is a limited amount of text and other material that can be included on a poster if it is to be readable. Conferences often have poster sessions where authors are invited to stand by their poster and engage with readers – so anyone interested can ask follow-up questions to supplement the often limited information given on the poster itself.

By contrast, a journal article has to stand on its own terms (as the authors cannot be expected to pop round for a conversation when you decide to read it). It is meant to present an argument for some new knowledge claim(s): an argument that depends on the details of the research conceptualisation, design, and data analysis. So what may seem as perfectly adequate in a poster may well not be sufficient to satisfy journal peer review.

The abstract of the conference poster was published in a journal (Mohialdin, Wainman & Shali, 2018) and I have reproduced that abstract in the table below, in the right hand column.


Mohialdin, 2018a (journal paper) and Mohialdin, Wainman & Shali, 2018 (conference poster)

The following three paragraphs appear, word for word, in both abstracts:

"With the technological progress of different types of portable Ultrasound machines, there is a growing demand by all health care providers to perform bedside Ultrasonography, also known as Point of Care Ultrasound (POCUS). This technique is becoming extremely useful as part of the Clinical Skills/Anatomy teaching in the undergraduate Medical School Curriculum.

Teaching/training health care providers how to use these portable Ultrasound machines can complement their physical examination findings and help in a more accurate diagnosis, which leads to a faster and better improvement in patient outcomes. In addition, using portable Ultrasound machines can add more safety measurements to every therapeutic/diagnostic procedure when it is done under an Ultrasound guide. It is also considered as an extra tool in teaching Clinical Anatomy to Medical students. Using an Ultrasound is one of the different imaging modalities that health care providers depend on to reach their diagnosis, while also being the least invasive method.

We thought investing in training the undergraduate Medical students on the basic Ultrasound scanning skills as part of their first year curriculum will help build up the foundation for their future career."

The conference poster abstract alone then continues:

"The research we report in this manuscript is a preliminary qualitative study. And provides the template for future model for teaching a hand on Ultrasound for all health care providers in different learning institutions.

A questionnaire was handed to the first year medical students to evaluate their hands on ultrasound session experience. Answers were collected and data was [sic] analyzed into multiple graphs."
Abstracts from Mohialdin's paper, plus the abstract from co-authored work presented at the Experimental Biology 2018 Meeting according to the journal of the Federation of American Societies for Experimental Biology. (See note 4 for another version of the abstract.)

The abstract includes some very brief information about what the researchers did (which is strangely missing from the journal article's abstract). Journals usually put limits on the word count for abstracts – but surely the poster's abstract was not considered too long for the journal? Yet someone (the author? the editor?) simply dropped the final two paragraphs – arguably the two most relevant paragraphs for readers.

The lost authors?

Not only had two paragraphs from the abstract gone missing, along with the figures, but the journal article had also lost two-thirds of its authors.

A poster with multiple authors

Now in the academic world authorship of research reports is not an arbitrary matter (Taber, 2018). An author is someone who has made a substantial intellectual contribution to the work (regardless of how much of the writing-up they undertake, or whether they are present when work is presented at a conference). That is a simple principle, which unfortunately may lead to disputes as it needs to be interpreted when applied; but, in most academic fields, there are conventions regarding what kind of contribution is judged significant and substantive enough for authorship.

It may well be that Prof. Mohialdin was the principal investigator on this study and that the contributions of Prof. Wainman and Prof. Shali were more marginal, and so it was not obvious whether or not they should be considered authors when reporting the study. But it is less easy to see how they qualified for authorship on the poster but not on the journal article with the same title which seems (?) to be the text of the poster (i.e., describes itself as being the poster). [It is even more difficult to see how they could be authors of the poster when it was presented at one conference, but not when it was presented somewhere else. 4]

Of course, one trivial suggestion might be that Wainman and Shali contributed only the final two paragraphs of the abstract, and the graphs, and that without these the – thus reduced – version in the journal deserved only one author according to normal academic authorship conventions. That is clearly not an acceptable rationale, as academic studies have to be understood more holistically than that!

Perhaps Wainman and Shali asked to have their names left off the paper as they did not want to be published in a journal of chemistry that would publish a provisional and incomplete account of a medical education practice study classified as a review article. Maybe they suspected that this would hardly enhance their scholarly reputations?

Work cited:
  • Feilchenfeld, Z., Dornan, T., Whitehead, C., & Kuper, A. (2017). Ultrasound in undergraduate medical education: a systematic and critical review. Medical Education. 51: 366-378. doi: 10.1111/medu.13211
  • Mohialdin, V. (2018a) An overview of the first year Undergraduate Medical Students Feedback on the Point of Care Ultrasound Curriculum. Journal of Chemistry: Education Research and Practice, 2 (2), 1-2.
  • Mohialdin, V. (2018b). An overview of the first year undergraduate medical students feedback on the point of care ultrasound curriculum. Journal of Health Education Research & Development, 6, 30.
  • Mohialdin, V., Wainman, B. & Shali, A. (2018) An overview of the first year Undergraduate Medical Students Feedback on the Point of Care Ultrasound Curriculum. The FASEB Journal. 32 (S1: Experimental Biology 2018 Meeting Abstracts), 636.4
  • Taber, K. S. (2013). Classroom-based Research and Evidence-based Practice: An introduction (2nd ed.). London: Sage.
  • Taber, K. S. (2018). Assigning Credit and Ensuring Accountability. In P. A. Mabrouk & J. N. Currano (Eds.), Credit Where Credit Is Due: Respecting Authorship and Intellectual Property (Vol. 1291, pp. 3-33). Washington, D.C.: American Chemical Society. [The publisher appears to have made this open access]

Footnotes:

1 The following section appears as part of the instructions for authors:

"Article Types

Journal of Chemistry: Education Research and Practice accepts Original Articles, Review, Mini Review, Case Reports, Editorial, and Letter to the Editor, Commentary, Rapid Communications and Perspectives, Case in Images, Clinical Images, and Conference Proceedings.

In general the Manuscripts are classified in to following [sic] groups based on the criteria noted below [I could not find these]. The author(s) are encouraged to request a particular classification upon submitting (please include this in the cover letter); however the Editor and the Associate Editor retain the right to classify the manuscript as they see fit, and it should be understood by the authors that this process is subjective to some degree. The chosen classification will appear in the printed manuscript above the manuscript title."

https://opastonline.com/journal/journal-of-chemistry-education-research-and-practice/author-guidelines

2 The ethical concerns in this kind of research are minimal, and in an area like medical education one might feel there is a moral imperative for future professionals to engage in activities to innovate and to evaluate such innovations. However, there is a general principle that all participants in research should give voluntary, informed consent.

(Read about Research Ethics here).

According to the policy statement on the author's (/authors'?) University's website (Research involving human participants, Sept. 2002) at the time of this posting (November, 2021) McMaster University "endorses the ethical principles cited in the Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans (1998)".

According to Article 2.1 of that document, Research Ethics Board Review is required for any research involving "living human participants". There are some exemptions, including (Article 2.5): "Quality assurance and quality improvement studies, program evaluation activities, and performance reviews, or testing within normal educational requirements when used exclusively for assessment, management or improvement purposes" (my emphasis).

My reading then is that this work would not have been subject to requiring approval following formal ethical review if it had been exclusively used for internal purposes, but that publication of the work as research means it should have been subject to Research Ethics Board Review before being carried out. This is certainly in line with advice to teachers who invite their own students to participate in research into their teaching that may be reported later (in a thesis, at a conference, etc.) (Taber, 2013, pp.244-248).


3 Some days ago, I wrote to the Journal of Chemistry: Education Research and Practice (in reply to an invitation to publish in the journal), with a copy of the email direct to the editor, asking where I could find the graphs referred to in this paper, but have not yet had a response. If I do get a reply I will report this in the comments below.


4 Since drafting this post, I have found another publication with the same title published in an issue of another journal reporting conference proceedings (Mohialdin, 2018b):

A third version of the publication (Mohialdin, 2018b).

The piece begins with the same material as in the table above. It ends with the following account of empirical work:

A questionnaire was handed to the first year undergraduate medical students at the end of session four, to evaluate their hands on ultrasound session experience. Answers were collected and data was [sic] analyzed into multiple graphs. The comments that we [sic] got from this survey were mainly positive; here are a few of the constructive comments that we [sic] received: This was a great learning experience; it was a great learning opportunity; very useful, leaned [sic] a lot; and loved the hand on experience.

Mohialdin, 2018b, p.30

There is nothing wrong with the same poster being presented at multiple conferences, and this is quite a common academic strategy. Mohialdin (2018b) reports from a conference in Japan, whereas Mohialdin, Wainman and Shali (2018) refers to a US meeting – but it is not clear why the author list is different, as the two presentations would seem to report the same research – indeed, it seems reasonable to assume from the commonality of (Mohialdin, 2018b) with (Mohialdin, Wainman & Shali, 2018) that they are the same report (poster).

Profs. Wainman and Shali should be authors of any report of this study if, and only if, they made substantial intellectual contributions to the work reported – and, surely, either they did, or they did not.

Not motivating a research hypothesis

A 100% survey return that represents 73% (or 70%, or perhaps 48%) of the population

Keith S. Taber

…the study seems to have looked for a lack of significant difference regarding a variable which was not thought to have any relevance…

This is like hypothesising…that the amount of alkali needed to neutralise a certain amount of acid will not depend on the eye colour of the researcher; experimentally confirming this is the case; and then seeking to publish the results as a new contribution to knowledge.

…as if a newspaper headline was 'Earthquake latest' and then the related news story was simply that, as usual, no earthquakes had been reported.

Structuring a research report

A research report tends to have a particular kind of structure. The first section sets out background to the study to be described. Authors offer an account of the current state of the relevant field – what can be called a conceptual framework.

In the natural sciences it may be that in some specialised fields there is a common, accepted way of understanding that field (e.g., the nature of important entities, the relevant variables to focus on). This has been described as working within an established scientific 'paradigm'. 1 However, social phenomena (such as classroom teaching) may be of such complexity that a full account requires exploration at multiple levels, with a range of analytical foci (Taber, 2008). 2 Therefore the report may indicate which particular theoretical perspective (e.g., personal constructivism, activity theory, Gestalt psychology, etc.) has informed the study.

This usually leads to one or more research questions, or even specific hypotheses, that are seen to be motivated by the state of the field as reflected in the authors' conceptual framework.

Next, the research design is explained: the choice of methodology (overall research strategy), the population being studied and how it was sampled, the methods of data collection and development of instruments, and choice of analytical techniques.

All of this is usually expected before any discussion (leaving aside a short statement as part of the abstract) of the data collected, results of analysis, conclusions and implications of the study for further research or practice.

There is a logic to designing research. (Image after Taber, 2014).

A predatory journal

I have been reading some papers in a journal that I believed, on the basis of its misleading title and website details, was an example of a poor-quality 'predatory journal'. That is, a journal which encourages submissions simply to be able to charge a publication fee (currently $1519, according to the website), without doing the proper job of editorial scrutiny. I wanted to test this initial evaluation by looking at the quality of some of the work published.

Although the journal is called the Journal of Chemistry: Education Research and Practice (not to be confused, even if the publishers would like it to be, with the well-established journal Chemistry Education Research and Practice) only a few of the papers published are actually education studies. One of the articles that IS on an educational topic is called 'Students' Perception of Chemistry Teachers' Characteristics of Interest, Attitude and Subject Mastery in the Teaching of Chemistry in Senior Secondary Schools' (Igwe, 2017).

A research article

The work of a genuine academic journal

A key problem with predatory journals is that because their focus is on generating income they do not provide the service to the community expected of genuine research journals (which inevitably involves rejecting submissions, and delaying publication till work is up to standard). In particular, the research journal acts as a gatekeeper to ensure nonsense or seriously flawed work is not published as science. It does this in two ways.

Discriminating between high quality and poor quality studies

Work that is clearly not up to standard (as judged by experts in the field) is rejected. One might think that in an ideal world no one is going to send work that has no merit to a research journal. In reality we cannot expect authors to always be able to take a balanced and critical view of their own work, even if we would like to think that research training should help them develop this capacity.

This assumes researchers are trained, of course. Many people carrying out educational research in science teaching contexts are only trained as natural scientists – and those trained as researchers in natural science often approach the social sciences with significant biases and blind-spots when carrying out research with people. (Watch or read 'Why do natural scientists tend to make poor social scientists?')

Also, anyone can submit work to a research journal – be they genius, expert, amateur, or 'crank'. Work is meant to be judged on its merits, not by the reputation or qualifications of the author.

De-bugging research reports – helping authors improve their work

The other important function of journal review is to identify weaknesses and errors and gaps in reports of work that may have merit, but where these limitations make the report unsuitable for publication as submitted. Expert reviewers will highlight these issues, and editors will ensure authors respond to the issues raised before possible publication. This process relies on fallible humans, and in the case of reviewers usually unpaid volunteers, but is seen as important for quality control – even if it is not a perfect system. 3

This improvement process is a 'win' all round:

  • the quality of what is published is assured so that (at least most) published studies make a meaningful contribution to knowledge;
  • the journal is seen in a good light because of the quality of the research it publishes; and
  • the authors can be genuinely proud of their publications which can bring them prestige and potentially have impact.

If a predatory journal which claims (i) to have academic editors making decisions and (ii) to use peer review does not rigorously follow proper processes, and so publishes (a) nonsense as scholarship, and (b) work with major problems, then it lets down the community and the authors – if not those making money from the deceit.

The editor took just over a fortnight to arrange any peer review, and come to a decision that the research report was ready for publication

Students' perceptions of chemistry teachers' characteristics

There is much of merit in this particular research study. Dr Iheanyi O. Igwe explains why there might be a concern about the quality of chemistry teaching in the research context, and draws upon a range of prior literature. Information about the population (the public secondary schools II chemistry students in Abakaliki Education Zone of Ebonyi State) and the sample is provided – including how the sample, of 300 students at 10 schools, was selected.

There is however an unfortunate error in characterising the population:

"the chemistry students' population in the zone was four hundred and ten (431)"

Igwe, 2017, p.8

This seems to be a simple typographic error, but the reader cannot be sure if this should read

  • "…four hundred and ten (410)" or
  • "…four hundred and thirty one (431)".

Or perhaps neither, as the abstract tells readers

"From a total population of six hundred and thirty (630) senior secondary II students, a sample of three hundred (300) students was used for the study selected by stratified random sampling technique."

Igwe, 2017, abstract

Whether the sample is 300/410 or 300/431 or even 300/630 does not fundamentally change the study, but one does wonder how these inconsistencies were not spotted by the editor, or a peer reviewer, or someone in the production department. (At least, one might wonder about this if one had not seen much more serious failures to spot errors in this journal.) A reader could wonder whether the presence of such obvious errors may indicate a lack of care that might suggest the possibility of other errors that a reader is not in a position to spot. (For example, if questionnaire responses had not been tallied correctly in compiling results, then this would not be apparent to anyone who did not have access to the raw data to repeat the analysis.) The author seems to have been let down here.

A multi-scale instrument

The final questionnaire contained 5 items on each of three scales

  • students' perception of teachers' interest in the teaching of chemistry;
  • students' perception of teachers' attitude towards the teaching of chemistry;
  • students' perception of teachers' mastery of the subject in the teaching of chemistry

Igwe informs readers that,

"the final instrument was tested for reliability for internal consistency through the Cronbach Alpha statistic. The reliability index for the questionnaire was obtained as 0.88 which showed that the instrument was of high internal consistency and therefore reliable and could be used for the study"

Igwe, 2017, p.4

This statistic is actually not very useful information, as one would want to know about the internal consistency within each scale – an overall value across scales is not informative (conceptually, it is not clear how it should be interpreted – perhaps as suggesting that the three scales are largely eliciting much the same underlying factor?) (Taber, 2018). 4

There are times when aggregate information is not very informative (Image by Syaibatul Hamdi from Pixabay)

Again, one might have hoped that expert reviewers would have asked the author to quote the separate alpha values for the three scales as it is these which are actually informative.
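To illustrate why the per-scale values matter (using fabricated Likert-style responses, not Igwe's data, and a hypothetical `cronbach_alpha` helper), Cronbach's alpha for a single scale is computed from the item variances and the variance of the summed scale score: alpha = k/(k-1) * (1 - sum of item variances / variance of total). A sketch:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of one scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Fabricated data: 100 respondents answering one 5-item scale on a 1-5 scale,
# generated so the items share a common underlying factor.
rng = np.random.default_rng(0)
latent = rng.normal(3, 1, size=(100, 1))
scale = np.clip(np.rint(latent + rng.normal(0, 0.7, size=(100, 5))), 1, 5)

alpha = cronbach_alpha(scale)
print(round(alpha, 2))
```

Reporting this calculation separately for each of the three scales (rather than pooling all fifteen items) is what would let a reader judge the internal consistency of each scale on its own terms.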

The paper also offers a detailed account of the analysis of the data, and an in-depth discussion of the findings and potential implications. This is a serious study that clearly reflects a lot of work by the researcher. (We might hope that could be taken for granted when discussing work published in a 'research journal', but sadly that is not so in some predatory journals.) There are limitations of course. All research has to stop somewhere, and resources and, in particular, access opportunities are often very limited. One of these limitations is the wider relevance of the population sampled.

But do the results apply in Belo Horizonte?

This is the generalisation issue. The study concerns the situation in one administrative zone within a relatively small state in South East Nigeria. How do we know it has anything useful to tell us about elsewhere in Nigeria, let alone about the situation in Mexico or Vietnam or Estonia? Even within Ebonyi State, the Abakaliki Education Zone (that is, the area of the state capital) may well be atypical – perhaps the best qualified and most enthusiastic teachers tend to work in the capital? Perhaps there would have been different findings in a more rural area?

Yet this is a limitation that applies to a good deal of educational research. This goes back to the complexity of educational phenomena. What you find out about an electron or an oxidising agent studied in Abakaliki should apply in Cambridge, Cambridgeshire or equally in Cambridge, Massachusetts. That cannot be claimed about what you may find out about a teacher in Abakaliki, or a student, a class, a school, a university.

Misleading study titles?

Educational research studies often have strictly misleading titles – or at least promise a lot more than they deliver. This may in part be authors making unwarranted assumptions, or it may be journal editors wanting to avoid unwieldy titles.

"This situation has inadvertently led to production of half backed graduate Chemistry educators."

Igwe, 2017, p.2

The title of this study does suggest that the study concerns perceptions of Chemistry Teachers' Characteristics …in Senior Secondary Schools, when we cannot assume that chemistry teachers in the Abakaliki Education Zone of Ebonyi State can stand for chemistry teachers more widely. Indeed some of the issues raised as motivating the need for the study are clearly not issues that would apply in all other educational contexts – that is, the 'situation' which is said to be responsible for the "production of half backed [half-baked?] graduate Chemistry educators" in Nigeria will not apply everywhere. Whilst the title could be read as promising more general findings than were possible in the study, Igwe's abstract is quite explicit about the specific population sampled.

A limited focus?

Another obvious limitation is that whilst pupils' perceptions of their teachers are very important, this does not offer a full picture. Pupils may feel the need to give positive reviews, or may have idealistic conceptions. Indeed, assuming that voluntary, informed consent was given (which would mean that students knew they could decline to take part in the research without fear of sanctions), it is of note that every one of the 30 students targeted in each of the ten schools agreed to complete the survey:

"The 300 copies of the instrument were distributed to the respondents who completed them for retrieval on the spot to avoid loss and may be some element of bias from the respondents. The administration and collection were done by the researcher and five trained research assistants. Maximum return was made of the instrument."

Igwe, 2017, p.4

To get a 100% return on a survey is pretty rare, and if normal ethical procedures were followed (with the voluntary nature of the activity made clear) then this suggests these students were highly motivated to appease adults working in the education system.

But we might ask how student perceptions of teacher characteristics actually relate to teacher characteristics?

For example, observations of the chemistry classes taught by these teachers could possibly give a very different impression of those teachers than that offered by the student ratings in the survey. (Another chemistry teacher may well be able to distinguish teacher confidence or bravado from subject mastery when a learner is not well placed to do so.) Teacher self-reports could also offer a different account of their 'Interest, Attitude and Subject Mastery', as could evaluations by their school managers. Arguably, a study that collected data from multiple sources would offer the possibility of 'triangulating' between sources.

However, Igwe is explicit about the limited focus of the study, and other complementary strands of research could be carried out to follow up on the study. So, although the specific choice of focus is a limitation, this does not negate the potential value of the study.

Research questions

Although I recognise a serious and well-motivated study, there is one aspect of Igwe's study which seemed rather bizarre. The study has three research questions (which are well reflected in the title of the study) and a hypothesis which may well surprise some readers.

That is not a good thing. At least, I always taught research students that unlike in a thriller or 'whodunnit' story, where a surprise may engage and amuse a reader, a research report or thesis is best written to avoid such surprises. The research report is an argument that needs to flow through the account – if a reader is surprised at something the researcher reports doing, then the author has probably forgotten to properly introduce or explain something earlier in the report.

Here are the research questions and hypotheses:

"Research Questions

The following research questions guided the study, thus:

How do students perceive teachers' interest in the teaching of chemistry?

How do students perceive teachers' attitude towards the teaching of chemistry?

How do students perceive teachers' mastery of the subjects in the teaching of chemistry?

Hypotheses
The following null hypothesis was tested at 0.05 alpha levels, thus:
HO1 There is no significant difference in the mean ratings of male and female students on their perception of chemistry teachers' characteristics in the teaching of chemistry."

Igwe, 2017, p.3

A surprising hypothesis?

A hypothesis – now where did that come from?

Now, I am certainly not criticising a researcher for looking for gender differences in research. (That would be hypocritical, as I looked for such differences in my own M.Sc. thesis, and published on gender differences in teacher-student interactions in physics classes, gender differences in students' interests in different science topics on starting secondary school, and links between pupil perceptions of (i) science-relatedness and (ii) gender-appropriateness of careers.)

There might often be good reasons in studies to look for gender differences. But these reasons should be stated up-front. As part of the conceptual framework motivating the study, researchers should explain that – based on their informal observations, on anecdotal evidence, or (better) on explicit theoretical considerations, or on the findings of other related studies, or whatever the reason might be – there are good grounds to check for gender differences.

The flow of research (Underlying image from Taber, 2013) The arrows can be read as 'inform(s)'.

Perhaps Igwe had such reasons, but there seems to be no mention of 'gender' as a relevant variable prior to the presentation of the hypothesis: not even a concerning dream, or signs in the patterns of tea leaves. 5 To some extent, this is reinforced by the choice of the null hypothesis – that no such difference will be found. Even if it makes no substantive difference to a study whether a hypothesis is framed in terms of there being a difference or not, psychologically the study seems to have looked for a lack of significant difference regarding a variable which was not thought to have any relevance.

Misuse of statistics

It is important for researchers not to test for effects that are not motivated in their studies. Statistical significance tells a researcher something is unlikely to happen just by chance – but it still might. Just as someone buying a lottery ticket is unlikely to win the lottery – but they might. Logically a small proportion of all the positive statistical results in the literature are 'false positives' because unlikely things do happen by chance – just not that often. 6 The researcher should not (metaphorically!) go round buying up lots of lottery tickets, and then seeing an occasional win as something more than chance.
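The 'lottery ticket' point can be shown with a short simulation (entirely illustrative, not drawn from the paper): if we repeatedly test for a 'gender difference' between two groups drawn from the same distribution, about 5% of the tests will come out 'significant' at the 0.05 level purely by chance.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_tests = 2000
false_positives = 0

for _ in range(n_tests):
    # Both groups are drawn from the SAME distribution, so any
    # 'significant' difference is a false positive by construction.
    group_a = rng.normal(50, 10, size=30)
    group_b = rng.normal(50, 10, size=30)
    _, p = stats.ttest_ind(group_a, group_b)
    if p < 0.05:
        false_positives += 1

rate = false_positives / n_tests
print(f"{rate:.3f}")  # close to the 0.05 significance level
```

This is why unmotivated tests are a problem: run enough of them across a literature and some 'positive' findings are guaranteed, whether or not any real effect exists.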

No alarms and no surprises

And what was found?

"From the result of analysis … the null hypothesis is accepted which means that there is no significant difference in the mean ratings of male and female students in their perception of chemistry teachers' characteristics (interest, attitude and subject mastery) in the teaching of chemistry."

Igwe, 2017, p.6

This is like hypothesising, without any motivation, that the amount of alkali needed to neutralise a certain amount of acid will not depend on the eye colour of the researcher; experimentally confirming this is the case; and then seeking to publish the results as a new contribution to knowledge.

Why did Igwe look for gender difference (or more strictly, look for no gender difference)?

  • A genuine relevant motivation missing from the paper?
  • An imperative to test for something (anything)?
  • Advice that journals are more likely to publish studies using statistical testing?
  • Noticing that a lot of studies do test for gender differences (whether there seems a good reason to do so or not)?

This seems to be an obvious point for peer reviewers and the editor to raise: asking the author to either (a) explain why it makes sense to test for gender differences in this study – or (b) to drop the hypothesis from the paper. It seems they did not notice this, and readers are simply left to wonder – just as you would if a newspaper headline was 'Earthquake latest' and then the related news story was simply that, as usual, no earthquakes had been reported.

Footnotes:

1 The term paradigm became widely used in this sense after Kuhn's (1970) work, although he later acknowledged criticisms of the ambiguous way he used the term – in particular, between learning about a field through working through standard examples (paradigms in a narrow sense) and the wider set of shared norms and values that develop in an established field, which he later termed the 'disciplinary matrix'. In psychology research, 'paradigm' may be used in the more specific sense of an established research design/protocol.


2 There are at least three ways of explaining why a lot of research in the social sciences seems more chaotic and less structured to outsiders than most research in the natural sciences.

  • a) Ontology. Perhaps the things studied in the natural sciences really exist, and some of those in the social sciences are epiphenomena and do not reflect fundamental, 'real', things. There may be some of that sometimes, but if so I think it is a matter of degree (that is, natural scientists have not been above studying the ether or phlogiston), because of the third option (c).
  • b) The social sciences are not as mature as many areas of the natural sciences and so are still 'pre-paradigmatic'. I am sure there is sometimes an element of this: any new field will take time to focus in on reliable and productive ways of making sense of its domain.
  • c) The complexity of the phenomena. Social phenomena are inherently more complex, often involving feedback loops between participants' behaviours and feelings and beliefs (including about the research, the researcher, etc.)

Whilst (a) and (b) may sometimes be pertinent, I think (c) is often especially relevant to this question.


3 An alternative approach that has gained some credence is to allow authors to publish, but then invite reader reviews which will also be published – and so allowing a public conversation to develop so readers can see the original work, criticism, responses to those criticisms, and so forth, and make their own judgements. To date this has only become common practice in a few fields.

Another approach for empirical work is for authors to submit research designs to journals for peer review – once a design has been accepted by the journal, the journal agrees to publish the resulting study as long as the agreed protocol has been followed. (This is seen as helping to avoid the distorting bias in the literature towards 'positive' results as studies with 'negative' results may seem less interesting and so less likely to be accepted in prestige journals.) Again, this is not the norm (yet) in most fields.


4 The statistic has a maximum value of 1, which would indicate that the items were all equivalent, so 0.88 seems a high value, till we note that a high value of alpha is a common artefact of including a large number of items.

However, playing Devil's advocate, I might suggest that the high overall value of alpha could suggest that the three scales

  • students' perception of teachers' interest in the teaching of chemistry;
  • students' perception of teachers' attitude towards the teaching of chemistry;
  • students' perception of teachers' mastery of the subject in the teaching of chemistry

are all tapping into a single underlying factor that might be something like

  • my view of whether my chemistry teacher is a good teacher

or even

  • how much I like my chemistry teacher
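The way alpha inflates with item count can be illustrated with the standardised formula for Cronbach's alpha, which depends only on the number of items and the average inter-item correlation. (The correlation value of 0.2 below is a hypothetical choice for illustration, not a figure taken from the study discussed.)

```python
# Standardised Cronbach's alpha: alpha = k*r / (1 + (k-1)*r),
# where k = number of items and r = average inter-item correlation.
def standardised_alpha(k, r):
    return (k * r) / (1 + (k - 1) * r)

# With the SAME modest inter-item correlation (r = 0.2, hypothetical),
# alpha rises simply because more items are included:
for k in (5, 10, 20, 30):
    print(k, round(standardised_alpha(k, 0.2), 2))
# 30 only modestly correlated items already give alpha close to 0.88,
# even though no single pair of items need correlate strongly.
```

This is one reason a high overall alpha across many items can be consistent with the items tapping one broad underlying factor rather than three distinct scales.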

5 Actually the discrimination made is between male and female students – it is not clear what question students were asked to determine 'gender', and whether other response options were available, or whether students could decline to respond to this item.


6 Our intuition might be that only a small proportion of reported positive results are false positives, because, of course, positive results reflect things unlikely to happen by chance. However if, as is widely believed in many fields, there is a bias to reporting positive results, this can distort the picture.

Imagine someone looking for factors that influence classroom learning. Consider that 50 variables are identified to test, such as teacher eye colour, classroom wall colour, type of classroom window frames, what the teacher has for breakfast, the day of the week that the teacher was born, the number of letters in the teacher's forename, the gender of the student who sits nearest the fire extinguisher, and various other variables which are not theoretically motivated as likely to have an effect. With a significance level of p[robability] ≤ 0.05 it is likely that there will be a very small number of positive findings JUST BY CHANCE. That is, if you look across enough unlikely events, it is likely some of them will happen. There is unlikely to be a thunderstorm on any particular day; yet there will likely be a thunderstorm on some day in the next year. If a report is written and published which ONLY discusses a positive finding then the true statistical context is missing, and a likely situation is presented as unlikely to be due to chance.
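The arithmetic here is easy to check. Treating the 50 variables as independent tests of true null hypotheses, each at the 0.05 level (an idealised sketch, not a model of any particular study):

```python
# Probability of at least one 'significant' result among n independent
# tests of true null hypotheses, each at significance level alpha.
def p_at_least_one_false_positive(n, alpha=0.05):
    return 1 - (1 - alpha) ** n

# A single test rarely gives a false positive...
print(round(p_at_least_one_false_positive(1), 2))   # 0.05
# ...but across 50 theoretically unmotivated tests it becomes very likely:
print(round(p_at_least_one_false_positive(50), 2))  # 0.92
# On average, 50 * 0.05 = 2.5 of the 50 tests will come out 'significant'
# purely by chance.
```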


Move over Mendeleev, here comes the new Mendel

Seeking the islets of Filipenka Henadzi


Keith S. Taber


"new chemical elements with atomic numbers 72-75 and 108-111 are supposedly revealed, and also it is shown that for heavy elements starting with hafnium, the nuclei of atoms contain a larger number of protons than is generally accepted"

Henadzi, 2019, p.2

Somehow I managed to miss a 2019 paper bringing into doubt the periodic table that is widely used in chemistry. It was suggested that many of the heavier elements actually have higher atomic numbers (proton numbers) than had long been assumed, with the consequence that when these elements are correctly re-positioned it reveals two runs of elements that should be in the periodic table, but which till now have not been identified by chemists.

According to Henadzi we need to update the periodic table and look for eight missing elements (original image by Gerd Altmann from Pixabay)

Henadzi (2019) suggests that "I would like to name groups of elements with the numbers 72-75 and 108-111 [that is, those not yet identified that should have these numbers], the islets of Filipenka Henadzi."

The original Mendeleev

This is a bit like being taken back to when Dmitri Mendeleev first proposed his periodic table and had the courage to organise elements according to patterns in their properties, even though this left gaps that Mendeleev predicted would be occupied by elements yet to be discovered. The success of at least some of his predictions is surely the main reason why he is considered the 'father' of the periodic table, even though others were experimenting with similar schemes.

Now it has been suggested that we still have a lot of work to do to get the periodic table right, and that the version that chemists have used (with some minor variations) for many decades is simply wrong. This major claim (which would surely be considered worthy of the Nobel prize if found correct) was not published in Nature or Science or one of the prestigious chemistry journals published by learned societies such as the Royal Society of Chemistry, but in an obscure journal that I suspect many chemists have never heard of.

The original Mendel

This is reminiscent of the story of Mendel's famous experiments with inheritance in pea plants. Mendel's experiments are now seen as seminal in establishing core ideas of genetics. But Mendel's research was ignored for many years.

He presented his results at meetings of the Natural History Society of Brno in 1865 and then published them in a local German language journal – and his ideas were ignored. Only after other scientists rediscovered 'his' principles in 1900, long after his death, was his work also rediscovered.

Moreover, the discussion of this major challenge to accepted chemistry (and physics if I have understood the paper) is buried in an appendix of a paper which is mostly about the crystal structures of metals. It seems the appendix includes a translation of work previously published in Russian, explaining why, oddly, a section part way through the appendix begins "This article sets out the views on the classification of all known chemical elements, those fundamental components of which the Earth and the entire Universe consists".

Calling out 'predatory' journals

I have been reading some papers in a journal that I believed, on the basis of its misleading title and website details, was an example of a poor-quality 'predatory journal'. That is, a journal which encourages submissions simply to be able to charge a publication fee (currently $1519, according to the website), without doing the proper job of editorial scrutiny. I wanted to test this initial evaluation by looking at the quality of some of the work published.

One of the papers I decided to read, partly because the topic looked of particular interest, was 'Nature of Chemical Elements' (Henadzi, 2019). Most of the paper is concerned with the crystal structures of metals, and presenting a new model to explain why metals have the structure they do. This is related to the number of electrons per atom that can be considered to be in the conduction band – something that was illustrated with a simple diagram that unfortunately, to my reading at least, was not sufficiently elaborated.1

The two options seem to correspond to n-type conduction (movement of electrons) and p-type conduction (movement of electrons that can be conceptualised as movement of a {relatively} positive hole, as in semi-conductor materials) – Figure 1 from Henadzi, 2019: p2

However, what really got my attention was the proposal for revising the periodic table and seeking eight new elements that chemists have so far missed.

Beyond Chadwick

Henadzi tells readers that

"The innovation of this work is that in the table of elements constructed according to the Mendeleyev's law and Van-den- Broek's rule [in effect that atomic number in the periodic table = proton number], new chemical elements with atomic numbers 72-75 and 108-111 are supposedly revealed, and also it is shown that for heavy elements starting with hafnium, the nuclei of atoms contain a larger number of protons than is generally accepted. Perhaps the mathematical apparatus of quantum mechanics missed some solutions because the atomic nucleus in calculations is taken as a point."

Henadzi, 2019, p.4

Henadzi explains

"When considering the results of measuring the charges of nuclei or atomic numbers by James Chadwick, I noticed that the charge of the core of platinum is rather equal not to 78, but to 82, which corresponds to the developed table. For almost 30 years I have raised the question of the repetition of measurements of the charges of atomic nuclei, since uranium is probably more charged than accepted, and it is used at nuclear power plants."

Henadzi, 2019, p.4

Now Chadwick is most famous for discovering the neutron – back in 1932. So he was working a long time ago, when atomic theory was still quite underdeveloped, and with apparatus that would seem pretty primitive compared with the kinds of set-up used today to investigate the fundamental structure of matter. That is, it is hardly surprising if his work, which was seminal nearly a century ago, had limitations. Henadzi, however, seems to feel that Chadwick's experiments reveal atomic numbers more accurately than has generally been realised.

Sadly, Henadzi does not cite any specific papers by Chadwick in his reference list, so it is not easy to look up the original research he is discussing. But if Henadzi is suggesting that data produced almost a century ago can be interpreted as giving some elements different atomic numbers to those accepted today, the obvious question is what other work, since, establishes the accepted values, and why it should not be trusted. Henadzi does not discuss this.

Explaining a long-standing mystery

Henadzi points out that whereas for the lighter elements the mass number is about twice the atomic number (that is, the number of neutrons in a nucleus approximately matches the number of protons), as one proceeds through the periodic table this changes such that the proton:neutron ratio shifts to give an increasing excess of neutrons. Henadzi also implies that this is a long-standing mystery, now perhaps solved.

"Each subsequent chemical element is different from the previous in that in its core the number of protons increases by one, and the number of neutrons increases, in general, several. In the literature this strange ratio of the number of neutrons to the number of protons for any the kernel is not explained. The article proposes a model nucleus, explaining this phenomenon."

Henadzi, 2019, p.5
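The pattern being referred to is easily demonstrated from standard textbook data on well-known stable (or, for uranium, very long-lived) nuclides – the neutron:proton ratio drifts upwards with atomic number:

```python
# (nuclide, protons Z, neutrons N) for some familiar nuclides
nuclides = [
    ("He-4",    2,   2),
    ("O-16",    8,   8),
    ("Ca-40",  20,  20),
    ("Fe-56",  26,  30),
    ("Sn-120", 50,  70),
    ("Pb-208", 82, 126),
    ("U-238",  92, 146),
]

ratios = [n / z for _, z, n in nuclides]
for (name, z, n), r in zip(nuclides, ratios):
    print(f"{name:7s} N/Z = {r:.2f}")

# The ratio never decreases as Z rises: about 1.0 for the light
# elements, around 1.5 or more for the heaviest.
assert all(a <= b for a, b in zip(ratios, ratios[1:]))
```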

Now what surprised me here was not the pattern itself (something taught in school science) but the claim that the reason was not known. My, perhaps simplistic, understanding is that protons repel each other because of their similar positive electrical charges, although the strong nuclear force binds nucleons (i.e., protons and neutrons collectively) into nuclei and can overcome this.

Certainly what is taught in schools is that as the number of protons increases more neutrons are needed to be mixed in to ensure overall stability. Now I am aware that this is very much an over-simplification, what we might term a curriculum model or teaching model perhaps, but what Henadzi is basically suggesting seems to be this very point, supplemented by the idea that as the protons repel each other they are usually found at the outside of the nucleus alongside an equal number of neutrons – with any additional neutrons within.

The reason, in Henadzi's model, for placing not only protons but also an equal number of neutrons on the outer shell of a large nucleus seems to relate to the stability of alpha particles (that is, clumps of two protons and two neutrons, as in the relatively stable helium nucleus). Or, at least, that was my reading of what is being suggested:

"For the construction of the [novel] atomic nucleus model, we note that with alpha-radioactivity of the helium nucleus is approximately equal to the energy.

Therefore, on the outer layer of the core shell, we place all the protons with such the same number of neutrons. At the same time, on one energy Only bosons can be in the outer shell of the alpha- particle nucleus and are. Inside the Kernel We will arrange the remaining neutrons, whose task will be weakening of electrostatic fields of repulsion of protons."

Henadzi, 2019, p.5

The lack of proper sentence structure does not help clarify the model being mooted.

Masking true atomic number

Henadzi's hypothesis seems to be that when protons are on the surface of the nucleus, the true charge, and so atomic number, of an element can be measured. But sometimes with heavier elements some of the protons leave the surface for some reason and move inside the nucleus where their charge is somehow shielded and missed when nuclear charge is measured. This is linked to the approximation of assuming that the charge on an object measured from the outside can be treated as a point charge.

This is what Henadzi suggests:

"Our nuclear charge is located on the surface, since the number of protons and the number of neutrons in the nucleus are such that protons and neutrons should be in the outer layer of the nucleus, and only neutrons inside, that is, a shell forms on the surface of the nucleus. In addition, protons must be repelled, and also attracted by an electronic fur coat. The question is whether the kernel can be considered a point in the calculations and up to what times? And the question is whether and when the proton will be inside the nucleus….if a proton gets into the nucleus for some reason, then the corresponding electron will be on the very 'low' orbit. Quantum mechanics still does not notice such electrons. Or in other words, in elements 72-75 and 108-111, some protons begin to be placed inside the nucleus and the charge of the nucleus is screened, in calculations it cannot be taken as a point."

Henadzi, 2019, p.5

So, I think Henadzi is suggesting that if a proton gets inside the nucleus, its associated electron is pulled into a very close orbit such that what is measured as nuclear charge is the real charge on the nucleus (the number of protons) partially cancelled by low lying electrons orbiting so close to the nucleus that they are within what we might call 'the observed nucleus'.

This has some similarity to the usual idea of shielding that leads to the notion of core charge. For example, a potassium atom can be modelled simplistically for some purposes as a single electron around a core charge of plus one (+19-2-8-8) as, at least as a first approximation, we can treat all the charges within the outermost N (4th) electron shell (the 19 protons and 18 electrons) as if they were a single composite charge at the centre of the atom. 2
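The simple accounting involved can be made explicit. This is a toy illustration of the curriculum model only (the simplistic shell picture, not a serious treatment of effective nuclear charge):

```python
# Core charge in the simple shell model: nuclear charge minus the
# electrons in the inner (shielding) shells.
def core_charge(protons, inner_shell_electrons):
    return protons - sum(inner_shell_electrons)

# Potassium: 19 protons, shells 2.8.8 inside the outer (4th) shell
print(core_charge(19, [2, 8, 8]))  # +1
# Sodium: 11 protons, shells 2.8 inside the outer (3rd) shell
print(core_charge(11, [2, 8]))     # +1
# Chlorine: 17 protons, shells 2.8 inside the outer shell
print(core_charge(17, [2, 8]))     # +7
```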

Dubious physics

Whilst I suspect that the poor quality of the English and the limited detail included in this appendix may well mean I am missing part of the argument here, I am not convinced. Besides the credibility issue (how can so many scientists have missed this for so long?) which should never be seen as totally excluding unorthodox ideas (the same thing could have been asked about most revolutionary scientific breakthroughs) my understanding is that there are already some quite sophisticated models of nuclear structure which have evolved alongside programmes of empirical research and which are therefore better supported than Henadzi's somewhat speculative model.

I must confess to not understanding the relevance of the point charge issue as this assumption/simplification would seem to work with Henadzi's model – from well outside the sphere defined by the nucleus plus low lying electrons the observed charge would be the net charge as if located at a central point, so the apparent nuclear charge would indeed be less than the true nuclear charge.

But my main objection would be the way electrostatic forces are discussed and, in particular, two features of the language:

Naked protons

protons must be repelled, and also attracted by an electronic fur coat…

I was not sure what was meant by "protons must be repelled, and also attracted by an electronic fur coat". The repulsion between protons in the nucleus is balanced by the strong nuclear force – so what is this electronic 'fur coat'?

This did remind me of common alternative conceptions that school students (who have not yet learned about nuclear forces) may have, along the lines that a nucleus is held together because the repulsion between protons is balanced by their attraction to the ('orbiting') electrons. Two obvious problems with this notion are that

  • the electrons would be attracting protons out of the nucleus just as they are repelling each other (that is, these effects reinforce, not cancel), and
  • the protons are much closer to each other than to the electrons, and the magnitude of force between charges diminishes with distance.

Newton's third law and Coulomb's law would need to be dis-applied for an electronic effect to balance the protons' mutual repulsions. (On Henadzi's model the conjectured low lying electrons are presumably orbiting much closer to the nucleus than the 1s electrons in the K shell – but, even so, the proton-electron distance will be much greater than the separation of protons in the nucleus.)3

But I may have misunderstood what Henadzi meant here by the attraction of the fur coat and its role in the model.

A new correspondence principle?

if a proton gets into the nucleus for some reason, then the corresponding electron will be on the very 'low' orbit

Much more difficult to explain away is the suggestion that "if a proton gets into the nucleus for some reason, then the corresponding electron will be on the very 'low' orbit". Why? This is not explained, so it seems assumed readers will simply understand and agree.

In particular, I do not know what is meant by 'the corresponding electron'. This seems to imply that each proton in the nucleus has a corresponding electron. But electrons are just electrons, and as far as a proton is concerned, one electron is just like any other. All of the electrons attract, and are attracted by, all of the protons.

Confusing a teaching scheme for a mechanism?

This may not always be obvious to school level students, especially when atomic structure is taught through some kind of 'Aufbau' scheme where we add one more proton and one more electron for each consecutive element's atomic structure. That is, the hydrogen atom comprises a proton and its 'corresponding' electron, and in moving on to helium we add another proton, with its 'corresponding' electron and some neutrons. These correspond only in the sense that to keep the atom neutral we have to add one negative charge for each positive charge. They 'correspond' in a mental accounting scheme – but not in any physical sense.

That is a conceptual scheme meant to do pedagogic work in 'building up' knowledge – but atoms themselves are just systems of fundamental particles following natural laws and are not built up by the sequential addition of components selected from some atomic construction kit. We can be misled into mistaking a pedagogic model designed to help students understand atomic structure for a representation of an actual physical process. (The nuclei of heavy elements are created in the high-energy chaos inside a star – within the plasma where it is too hot for them to capture the electrons needed to form neutral atoms.)

A similar category error (confusing a teaching scheme for a mechanism) often occurs when teachers and textbook authors draw schemes of atoms combining to form molecules (e.g., a methane molecule formed from a carbon atom and four hydrogen atoms) – it is a conceptual scheme that serves the psychological need for students to have knowledge built up in manageable learning quanta – but such schemes do not reflect viable chemical processes.4

It is this kind of thinking that leads to students assuming that during homolytic bond fission each atom gets its 'own' electron back. It is not so much that this is not necessarily so, as that the notion of one of the electrons in a bond belonging to one of the atoms is a fiction.

The conservation of force conception (an alternative conception)

When asked about ionisation of atoms it is common for students to suggest that when an electron is removed from an atom (or ion) the remaining electrons are attracted more strongly because the force for the removed electron gets redistributed. It is as if within an atom each proton is taking care of attracting one electron. In this way of thinking a nucleus of a certain charge gives rise to a certain amount of force which is shared among the electrons. Removing an electron means a greater share of the force for those remaining. This all seems intuitive enough to many learners despite being at odds with basic physical principles (Taber, 1998).

I am not deducing that Henadzi, apparently a retired research scientist, shares these basic misconceptions found among students. Perhaps that is the case, but I would not be so arrogant as to diagnose this just from the quoted text. But that is my best understanding of the argument in the paper. If that is not what is meant, then I think the text needs to be clearer.

The revolution will not be televised…

In conclusion, this paper, published in what is supposedly a research journal, is unsatisfactory because (a) it makes some very major claims that if correct are extremely significant for chemistry and perhaps also physics, but (b) the claims are tucked away in an appendix, are not fully explained and justified, and do not properly cite work referred to; and the text is sprinkled with typographic errors, and seems to reflect alternative conceptions of basic science.

I very much suspect that Henadzi's revolutionary ideas are just wrong and should rightly be ignored by the scientific community, despite being published in what claims to be a peer-reviewed (self-described 'leading international') research journal.

However, perhaps Henadzi's ideas may have merit – the peer reviewers and editor of the journal presumably thought so – in which case they are likely to be ignored anyway because the claims are tucked away in an appendix, are not fully explained and justified, and do not properly cite work referred to; and the text is sprinkled with typographic errors, and seems to reflect alternative conceptions of basic science. In this case scientific progress will be delayed (as it was when Mendel's work was missed) because of the poor presentation of revolutionary ideas.

How does the editor of a peer-reviewed journal move to a decision to publish in 4 days?
Let down by poor journal standards

So, either way, I do not criticise Henadzi for having and sharing these ideas – healthy science encompasses all sorts of wild ideas (some of which turn out not to have been so wild as first assumed) which are critiqued, tested, and judged by the community. However, Henadzi has not been well supported by the peer review process at the journal. Even if peer reviewers did not spot some of the conceptual issues that occurred to me, they should surely have noticed the incompleteness of the argument or at the very least the failures of syntax. But perhaps in order to turn the reviews around so quickly they did not read the paper carefully. And perhaps that is how the editor, Professor Nour Shafik Emam El-Gendy of the Egyptian Petroleum Research Institute, was able to move to a decision to publish four days after submission.5

If there is something interesting behind this paper, it will likely be missed because of the poor presentation and the failure of peer review to support the author in sorting the problems that obscure the case for the proposal. And if the hypothesis is as flawed as it seems, then peer review should have prevented it being published until a more convincing case could be made. Either way, this is another example of a journal rushing to publish something without proper scrutiny and concern for scientific standards.


Footnotes:

1 My understanding of the conduction band in a metal is that due to the extensive overlap of atomic orbitals, a great many molecular orbitals are formed, mostly being quite extensive in scope ('delocalised'), and occurring with a spread of energy levels that falls within an energy band. Although strictly the molecular orbitals are at a range of different levels, the gaps between these levels are so small that at normal temperatures the 'thermal energy' available is enough for electrons to readily move between the orbitals (whereas in discrete molecules, with a modest number of molecular orbitals available, transitions usually require absorption of higher energy {visible or more often} ultraviolet radiation). So, this spread of a vast number of closely spaced energy levels is in effect a continuous band.

Given that understanding I could not make sense of these schematic diagrams. They SEEM to show the number of conduction electrons in the 'conduction band' as being located on, and moving around, a single atom. But I may be completely misreading this – as they are meant to be (cross sections through?) a tube.

"we consider a strongly simplified one- dimensional case of the conduction band. Option one: a thin closed tube, completely filled with electrons except one. The diameter of the electron is approximately equal to the diameter of the tube. With such a filling of the zone, with the local movement of the electron, there is an opposite movement of the "place" of the non-filled tube, the electron, that is, the motion of a non-negative charge. Option two: in the tube of one electron – it is possible to move only one charge – a negatively charged electron"

Henadzi, 2019, p.2

2 The shell model is a simplistic model, and for many purposes we need to use more sophisticated accounts. For example, the electrons are not strictly in concentric shells, and electronic orbitals 'interpenetrate' – so an electron considered to be in the third shell of an atom will 'sometimes' be further from the nucleus than an electron considered to be in the fourth shell. That is, a potassium 4s electron cannot be assumed to be completely/always outside of a sphere in which all the other atomic electrons (and the nucleus) are contained, so the core cannot be considered as a point charge of +1 at the nucleus, even if this works as an approximation for some purposes. The effective nuclear charge from the perspective of the 4s electron will strictly be more than +1 as the number of shielding electrons is somewhat less than 18.

3 Whilst the model of electrons moving around the nucleus in planetary orbits may have had some heuristic value in the development of atomic theory, and may still be a useful teaching model at times (Taber, 2013), it seems it is unlikely to have the sophistication to support any further substantive developments to chemical theory.

4 It is very common for learners to think of chemistry in terms of atoms – e.g., to think of atoms as starting points for reactions; to assume that ions must derive from atoms. This way of thinking has been called the atomic ontology.

5 I find it hard to believe that any suitably qualified and conscientious referees would not raise very serious issues about this manuscript precluding publication in the form it appears in the journal. If the journal really does use peer review, as is claimed, one has to wonder who they think suitable to act as expert reviewers, and how they persuade them to write their reports so quickly.

Based on this, and other papers appearing in the journal, I suspect one of the following:

a) peer review does not actually happen, or

b) peer review is assigned to volunteers who are not experts in the field, and so are not qualified to be 'peers' in the sense intended when we talk of academic peer review, or

c) suitable reviewers are appointed, but instructed to do a very quick but light review ignoring most conceptual, logical, technical and presentation issues as long as the submission is vaguely on topic, or

di) appropriate peer reviewers are sought, but the editor does not expect authors to address reviewer concerns before approving publication, or possibly

dii) decisions to publish sub-standard work are made by administrators without reference to the peer reviews and the editor's input

A failure of peer review

A copy of a copy – or plagiarism taken to the extreme

The journal 'Acta Scientific Pharmaceutical Sciences' is a research journal which describes itself as

"a scientific, multidisciplinary journal with 1.020 Impact factor, that strongly desires to disseminate knowledge in the field of Pharmaceutical Science and Technology"

The journal has been publishing since 2017 – one of a great number of new scientific journals competing for researchers' work. As well as the quite decent impact factor for such a new journal, it also claims two other metrics – a 32% acceptance rate and a period from acceptance to publication of 20-30 days.

Impact factor

The usual (that is, accepted, canonical) way of measuring impact factors is in terms of the average number of times articles in a journal are cited in other articles. Usually it is calculated over a set period (say within 5 years of publication) and based only on citations in articles in a database of journals that are considered to meet quality criteria. Some journal articles may never get cited, whilst others are cited a great deal, and the impact factor reflects an average for a journal.
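As a concrete illustration of the canonical calculation, taking the common two-year citation window as the example (all the figures below are invented purely for illustration):

```python
# Two-year impact factor: citations received in year Y to articles the
# journal published in years Y-1 and Y-2, divided by the number of
# citable articles published in those two years.
# (All numbers here are hypothetical.)
def impact_factor(citations_to_last_two_years, articles_last_two_years):
    return citations_to_last_two_years / articles_last_two_years

# e.g. 150 citations in 2021 to the 120 articles published in 2019-2020:
print(impact_factor(150, 120))  # 1.25
```

Which is why a claimed 'impact factor' means little unless the counting window, and the database of citing journals, are stated.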

However, I am wary of claims of impact factors unless I see how they are derived, as I have seen journals claiming 'impact factors' that are based on a completely different set of criteria – a bit like claiming the room temperature is 300K because the display of a chemical balance indicated '300'. (See 'Publish at speed, recant at leisure'.)

The timescale of review and publication

In the past some journals took months, even years, to publish a submitted manuscript. Clearly for an author the quicker the time from submission to publication the better – at least, all things being equal. They are not always equal, however.

It is usually considered better to publish in a recognised high status journal where work is likely to get more attention from others working in a field, and where the publication brings more prestige to the authors and their institutions. So, an author may well feel that slow publication in a 'good' journal is preferable to quicker publication in a nondescript one.

However, time from acceptance to publication is perhaps not the most useful metric to guide authors. By the time I stepped down from editing the Royal Society of Chemistry's education journal, Chemistry Education Research and Practice, it was often publishing an advanced version of an accepted article on the day I accepted it (and the final version of record within about a week or so). Yet that ignores the time a submission spends in review.

That is the time it takes for an editor to

  • screen the submission (make sure it is within the scope of the journal and includes sufficient detail for a careful evaluation),
  • identify and invite expert reviewers,
  • receive back their reports,
  • consider these and reach a decision
  • ask authors to make any revisions seen as necessary,
  • receive back a corrected/revised submission,
  • decide whether this seems to meet the changes needed,
  • and whether the revised submission also needs to go back to reviewers.

Sometimes this process can be quick – sometimes it may be drawn out with a number of cycles of revision before authors satisfy reviewers/editors and a manuscript is accepted. Expert reviewers who are highly respected in their fields are often very busy and get many requests to review.

So, average time from submission to acceptance would seem to be a key metric both because it may help authors avoid journals where editors and reviewers are very slow to turn around work, and because if this period is very short then it may bring into question whether there is rigorous review.

Acceptance rate

In this regard, the journal's claimed acceptance rate of 32% looks healthy. Around two thirds of material submitted to the journal is (by deduction) rejected as not suitable for publication. Assuming this figure is accurate, it does suggest that peer review is taken seriously. (One likes to trust in the honesty of others, but sadly there are many predatory journals not above being dishonest, as I have discussed in a range of postings.)

Peer review

The publisher's site certainly suggests that the publisher recognises the importance of careful peer review undertaken by "eminent reviewers", with guidance for reviewers.

"Acta Scientifica believes that, thorough peer review process is a critical factor to yield immense quality literature to be published in the journal."

https://www.actascientific.com/reviewer.php

Among the points made here, potential reviewers are guided that

"The study should possess novelty and should present the results of original research. It is required that the reported results are not published elsewhere."

The benefits of peer review are said to be

  •  "The author receives detailed and constructive feedback from experts in the field.
  •  The process can alert authors to errors or gaps in literature they may have overlooked.
  •  It can assist with making the paper more applicable to the journal readership.
  •  It may enable a discussion (between the author, reviewers, and editor) around a research field or topic.
  •  Readers can be assured that the research they are reading has been verified by subject experts." (https://www.actascientific.com/peerreview.php)

The peer review process is said to assure

  • "Submitted article is original work which has not been previously published nor is under consideration by another journal, in part or whole;
  • The article meets all applicable standards of ethics;
  • The paper is relevant to the journal's aims, scope, and readership;
  • A submitted article presents original research findings;
  • A submitted article offers a comprehensive critical review and evaluation of key literature sources for a given topic; and
  • The article is methodologically and technically sound"(https://www.actascientific.com/peerreview.php)

The publisher offers a flow chart showing the stages of the editorial and review process. The publisher also explains the advantages of the double blind peer review process (the reviewers are not told who wrote the submission, and the author is not told who reviewed their work) they operate in order to ensure "evaluation of work in the manuscripts by peers who have an expertise in the relevant field."

Checking for plagiarism

The flow chart shows that before submissions are sent for review there is a screening to ensure that at least 80% of the manuscript is 'unique content' – that is, that material has not just been copied from the author's previous publications – or even someone else's.
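Screening tools of this kind typically work by matching strings of words against previously published text. As a rough illustration (my own sketch, not the publisher's actual software), the overlap between a submission and a known source can be estimated by comparing word 'shingles':

```python
# Minimal sketch of how text-matching screeners work: compare word
# 5-grams ('shingles') of a submission against known published text.
# This is an illustration of the general technique, not any
# publisher's actual tool.

def shingles(text, n=5):
    """Return the set of n-word sequences in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def unique_fraction(submission, known_source):
    """Fraction of the submission's shingles NOT found in the source."""
    sub = shingles(submission)
    if not sub:
        return 1.0
    copied = sub & shingles(known_source)
    return 1 - len(copied) / len(sub)
```

One consequence worth noting: a passage translated from another language shares almost no word strings with its source, and so would score as entirely 'unique content' under such a check.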

All of this seems encouraging. The impression is that Acta Scientific are genuine in their aspiration to publish quality work, and to use a rigorous peer review process to ensure this quality. This is despite the reason why I came to be looking into their processes.

Which came first…

I recently posted in this blog about a short article in the Journal of Chemistry: Education and Research (not to be confused with the journal Chemistry Education Research and Practice) that I found to be incoherent and filled with mistakes.

When I was evaluating that article I came across another article with the same title, by the same author, in Acta Scientific Pharmaceutical Sciences. It soon became clear that these were (this was?) the same short article, published in both journals. Both articles have the same muddled language and the same errors (running words together and the like – for more details see 'Can deforestation stop indigenous groups starving?').

The chronology seems to be:

  • 14th May 2019 – da Silva submits to Acta Scientific Pharmaceutical Sciences
  • 16th May 2019 – da Silva sends the same manuscript to Journal of Chemistry: Education and Research
  • 20th May 2019 – Journal of Chemistry: Education and Research accepts the article for publication (4 days after submission!)
  • 28th May 2019 – Journal of Chemistry: Education and Research publishes the article
  • 7th June 2019 – Acta Scientific Pharmaceutical Sciences publishes the article

    Was da Silva frustrated with not getting his article accepted within two days of first submission? (An acceptance date for Acta Scientific Pharmaceutical Sciences is not given.)

    So, the article was submitted first to Acta Scientific Pharmaceutical Sciences, but had already been published in Journal of Chemistry: Education and Research by the time it was published in Acta Scientific Pharmaceutical Sciences. Given that authors are not supposed to publish the same material in several journals, this might raise the interesting question of which journal should require the work to be retracted, and which should allow it to stand.

    A copy of a copy

    However this would be a rather pointless question, as neither of the articles can claim to be original. As I discuss in 'Can deforestation stop indigenous groups starving?', virtually the entire text is simply lifted from three prior, unacknowledged publications written by other authors – odd paragraphs have been taken from parts of more detailed papers on the topic and simply collated (in a somewhat incoherent manner) into da Silva's manuscript. Any reputable journal that spotted this would require retraction because the work is not original but is plagiarised – it is the intellectual property of other scholars.

    Why was this not spotted?

    Although the opening of the article is simply copied word for word from the abstract of a published work (which is likely to be spotted by the tool used to screen for 'unique content'), the rest of the material (that is, more than the critical 80%) is translated from texts which are in Portuguese.

    When an expert translator produces a new version of a work in a different language, and this is done with permission, the translator is entitled to credit and the translation is considered to be a work (albeit a derivative work) in its own right. Good translations are more than mechanical substitutions, and skillful translators are much appreciated.

    However, here we have works translated, without expertise (the English is full of mistakes), presumably without permission and certainly without attribution to the original authors. The software will not have recognised the translated text as not being 'unique content'.

    However, the process of peer review is supposed to evaluate the quality of the work, and identify areas for improvement. It is difficult to believe anyone who read this very short article carefully (for either journal) could have thought it was making a coherent argument, or that it did not at least need restructuring, clarifications and corrections.

    "We ensure that all the articles published in Acta Scientific undergo integrated peer review by peers and consequent revision by authors when required."

    https://www.actascientific.com/peerreview.php

    So, despite Acta Scientific's efforts to claim careful peer review processes, and what seems a genuine aspiration to ensure article originality and quality through peer review by those with expertise in the field, somehow the journal published the copy-and-paste job that is 'The Chemistry of Indigenous Peoples'.

    Of course, for peer review to work, those asked to review have to take the role seriously.

    "Acta Scientific trusts the genuine peer review process that the reviewers carry out so that it helps us to publish the content with good essence."

    https://www.actascientific.com/reviewer.php

    I would like to believe that Acta Scientific's fine claims about peer review ARE sincere, and perhaps in this case it was just that their trust was betrayed by sloppy reviewers.

    Work cited:
    • da Silva, M. A. G. (2019). The Chemistry of Indigenous Peoples. Acta Scientific Pharmaceutical Sciences, 3 (7), 20-21.
    • da Silva, M. A. G. (2019) The Chemistry of Indigenous Peoples. Journal of Chemistry: Education Research and Practice, 3 (1), 1-2

    Laboratory safety – not on the face of it

    An invalid research instrument for testing 'safety sign awareness'

    Keith S. Taber

    I was recently invited to write for the 'Journal of Chemistry: Education Research and Practice' (not to be confused with the well-established R.S.C. Journal 'Chemistry Education Research and Practice') which describes itself as "a leading International Journal for the publication of high quality articles". It is not.

    From the Journal Homepage

    I already had reason to suspect this of being a predatory journal (one that entices authors to part with money to publish work without adhering to the usual academic standards and norms). But as I had already reached that judgement before the journal had started publishing, I decided to check out the quality of the published work.

    The current issue, at the time of writing, has five articles, only one of which is educational in nature: 'Chemistry Laboratory Safety Signs Awareness Among Undergraduate Students in Rivers State'.

    Below I describe key aspects of this study, including some points that I would have expected to have been picked up in peer review, and therefore to have been addressed before the paper could have been published.

    Spoiler alert

    My main observation is that the research instrument used is invalid – I do not think it actually measures what the authors claim it does. (As the article is published with an open-access license1, I am able to reproduce the instrument below so you can see if you agree with me or not.)

    'Chemistry laboratory safety signs awareness among undergraduate students in Rivers State'

    A study about chemistry laboratory safety signs awareness?

    Laboratory safety is very important in chemistry education, and is certainly a suitable topic for research. A range of signs and symbols are used to warn people of different types of potential chemical hazard, so learning about these signs is important for those working in laboratories; investigating this aspect of learning is thus a suitable focus for enquiry.

    Motivating a study

    As part of a published research study authors are expected to set out the rationale for the study – to demonstrate, usually based on existing literature, that there is something of interest to investigate. This can be described as the 'conceptual framework' for the study. This is one of the aspects of a study which is usually tested in peer-review where manuscripts submitted to a journal are sent to other researchers with relevant expertise for evaluation.

    The authors of this study, Ikiroma, Chinda and Bankole, did begin by discussing aspects of laboratory safety, and reporting some previous work around this topic. They cite an earlier study that had been carried out surveying second-year science education students at Lagos State University, Nigeria, and where:

    "The result of the study revealed 100% of the respondents are not aware of the laboratory sign and symbols" 2

    Ikiroma, Chinda & Bankole, 2021: 50

    This would seem a good reason to do follow-up work elsewhere.

    Research questions and hypotheses

    A study should have one or more research questions. These will be quite general in more open-ended 'discovery research' (exploratory enquiry), but need to be more specific in 'confirmatory research' such as experiments and surveys. This study had both specific research questions and null hypotheses.

    "Research Questions

    1. What is the percentage awareness level of safety signs among undergraduate Chemistry students?

    2. What is the difference in awareness level of safety signs between undergraduate Chemistry Education students and Chemistry Science students?

    3. To what extent do the awareness levels of safety signs among undergraduate Chemistry students depended on Institutional types?"

    Hypotheses

    1. There is no significant difference in awareness level of safety signs between undergraduate Chemistry Education students and Chemistry Science students

    2. The awareness levels of safety signs among undergraduate Chemistry students are not significantly dependent on Institutional types."

    Ikiroma, Chinda & Bankole, 2021: 50

    These specific questions and hypotheses do not seem to be motivated in the conceptual framework. That is, a reader has not been given any rationale to think that there are reasons to test for differences between these particular groups. There may have been good reasons to explore these variables, but authors of research papers are usually expected to share their reasoning with readers. (This is something which one would expect to be spotted in peer review, leading to the editor asking the authors to revise their submission to demonstrate the background behind asking about these specific points.)

    It is not explained quite what 'institutional types' actually refers to. From the way results are discussed later in the paper (p.53), 'Institutional types' seems to be used here simply to mean different universities.

    Sampling – how random is random?

    The sample is described as:

    "A total of 60 year three undergraduate students studying Chemistry Education (B.Sc. Ed) and Pure Chemistry (B.Sc.) were randomly drawn from three universities namely; University of Port Harcourt (Uniport), Rivers State University (RSU) and Ignatius Ajuru University of Education (IAUE) with each university contributing 20 students."

    Ikiroma, Chinda & Bankole, 2021: 50

    This study was then effectively a survey where data were collected from a sample of a defined population (third-year undergraduate students studying chemistry education or pure chemistry in any of three named universities in one geographical area) to draw inferences about the whole population.

    Randomisation is an important process when it is not possible to collect data from the whole population of interest, as it allows statistics to be used to infer from the sample what is likely in the wider population. Ideally, authors should briefly explain how they have randomised (Taber, 2013) so readers can judge if the technique used does really give each member of the population (here one assumes 3rd year chemistry undergraduates in each of the Universities) an equal chance of being sampled. (If the authors are reading this blog, please feel free to respond to this point in the comments below: how did you go about the randomisation?)

    Usually in survey research an indication would be given of the size of the population (as a random sample of 0.1% of a population gives results with larger inherent error than a random sample of 10%). That information does not seem to be provided here.

    Even if the authors did use randomisation, presumably they did not randomise across the combined population of "year three undergraduate students studying Chemistry Education (B.Sc. Ed) and Pure Chemistry (B.Sc.)…from three universities" as they would have been very unlikely to have ended up with equal numbers from the three different institutions. So, probably this means they took (random?) samples from within each of the three sub-populations (which would be sensible to compare between them).

    It later becomes clear that of the 60 sampled students, 30 were chemistry education students and 30 straight chemistry students (p.53) – so again it seems likely that sampling was done separately for the two types of course. There does not seem to be any information on the breakdown between university and course, so it is possible there were 10 students in each of 6 cells, if each university offered both courses:

                                            chemistry education   pure chemistry   total
    University of Port Harcourt                       ?                  ?            20
    Rivers State University                           ?                  ?            20
    Ignatius Ajuru University of Education            ?                  ?            20
    total                                            30                 30            60

    Sample

    Clearly this distribution potentially matters as there could be interactions between these two variables. Consider for example that perhaps students taking pure chemistry tended to have a higher 'awareness level of safety signs' than students taking chemistry education: then (see the hypothetical example in the table below), if the sample from one university comprised mostly pure chemistry students, and that from another university mostly chemistry education students, this would likely lead to finding differences between institutions in the samples even if there were no such differences between the combined student populations in the two universities. The uneven sampling from the two courses within the universities would bias the comparison between institutions.

                     course 1   course 2   total
    University A        20          0        20
    University B        10         10        20
    University C         0         20        20
    total               30         30        60

    A problematic sample for disentangling factors

    My best guess is that the authors appreciated that, that all three universities taught both types of course, and that the authors sampled 10 students from each course in each of the universities. Perhaps they even did it randomly – but it would be good to know how, as I have found that sometimes authors who claim to have made random selections have not used a technique that would strictly support this claim. (And if a sample is not random we can have much less confidence about how it reflects the population sampled.)
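For what it is worth, a stratified random draw of the kind I am guessing at here – 10 students from each course within each university – could be sketched as follows. The student rosters (and their sizes) are invented placeholders, as the paper gives no population figures:

```python
import random

# Sketch of stratified random sampling: draw 10 students from each
# course within each university, so every cell of the design is
# equally represented. Rosters are invented placeholders.

universities = ["Uniport", "RSU", "IAUE"]
courses = ["chemistry education", "pure chemistry"]

# Pretend each cell of the population has 40 enrolled students
rosters = {(u, c): [f"{u}-{c}-student{i}" for i in range(40)]
           for u in universities for c in courses}

# random.sample draws without replacement, giving each roster member
# an equal chance of selection within their cell
sample = {cell: random.sample(roster, 10)
          for cell, roster in rosters.items()}

total = sum(len(s) for s in sample.values())
print(total)  # 60
```

Whether the authors did anything like this, only they can say – which is exactly why reporting the randomisation technique matters.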

    The point is that a reader of a research report should not have to guess. Often researchers (and research students) are so close to their own project that it becomes easy to assume others will know things about the work that have become taken for granted by the research team. This is where a good editor or peer reviewer can point out, and ask for, missing information that is not available to a reader.

    Ethical research?

    Sampling can also be impacted by ethics. It is one thing to select people randomly, but not all people will volunteer to help with research, and it is a general principle of educational research that participants should offer voluntary informed consent. Where some people agree to participate, and others do not, this may bias results if people's reasons for accepting/declining an invitation are linked to the focus of the research.

    Imagine inviting students to take part in some research to test whether cheating (copying homework, taking reference material into examinations) can be detected by using a lie detector while questioning students about their behaviours. Are those who cheat and those who are scrupulously honest likely to volunteer to take part in such research to the same extent, or might we expect most cheats to opt out?

    It is normal practice in educational research to make a brief statement that the research was carried out ethically, e.g., that participants all volunteered freely having had the purpose and nature of the research clearly explained to them. I could not find any such statement in the article, nor any requirement for authors to include this in the journal's author guidelines.

    Lack of face validity

    In research, validity is about measuring what you think you are measuring. In the school laboratory, if we saw a student completing the 'potential difference/V' column of a results table when taking readings with an ammeter we would consider the recorded results were invalid.

    I once gave a detention to a first year (Y7) student who had done something naughty that I forget now, and as we were working on a measurement topic I set her to measure the length of the corridor outside the lab. with a metre rule. Although this was an appropriate instrument, I found that she did not appreciate that in order to get a valid result she had to make sure she moved the metre stick on by the right amount (that is, one metre!) for each counted metre – instead she would move the metre stick by about half its length! Some pupils may resent being in detention and deliberately respond with sloppy work, but in this case it seemed the fault was with the teacher who had overestimated prior knowledge and consequently given an insufficiently detailed explanation of the task!

    In research we have to be confident that an instrument is measuring what it is meant to. This may mean testing and calibrating – using the instrument somewhere where we already have a good measure and checking it gives the expected answers (like checking a clock against the Greenwich pips on the radio) before using it in research to measure an unknown.

    In educational studies we can sometimes spot invalid instruments because they lack face validity – that is, 'on the face of it' an instrument does not seem suitable to do the job. Certainly when 'we' are people with relevant expertise. Consider an instrument to test understanding of trigonometry which consisted of the item: "discuss five factors which contributed to the 'industrial revolution' in eighteenth century Britain". We might suspect this could be used to measure something, but probably not understanding of trigonometry. This would be an invalid test to use for that purpose.

    Awareness level of safety signs?

    The focus of Ikiroma, Chinda & Bankole's study was 'awareness level of safety signs'. Strictly this only seems to mean being aware of such signs3, but I read this to mean that the authors wanted to know if students recognised the meaning of different signs commonly used: whether they were aware what particular signs signified.

    The 'Chemistry Laboratory Test on Safety Signs' instrument:

    The authors report they used:

    A well validated and researchers['] constructed test instrument, titled, Chemistry Laboratory Test on Safety Signs (CLTSS) which had an internal reliability index of 0.94 via Cronbach Alpha was used for data collection in the study. The questions in the test required the students to match a list of 20 chemicals in column A and of nine safety signs accompanied with a short description in column B. This aimed to reduce the wrong response because the students incorrectly considered only the symbol.

    Ikiroma, Chinda & Bankole, 2021: 50
    Validation

    A key question that an editor would expect peer reviewers to consider is whether the instrumentation used in research can provide valid findings. Where this is not clear in a manuscript submitted to a journal, the editor should (if not rejecting the paper) ask for this to be addressed in a revision of the manuscript. Validity is clearly critical, and research should not be published if it makes claims based on invalid instrumentation.

    Therefore when reporting research instruments it is usually expected that authors explain how they tested for validity – simply stating something is well-validated does not count! Face validity might be tested by asking carefully identified experts to see if they think the instrument tests what is claimed (so here, perhaps asking university chemistry lecturers – "do you think these questions are suitable for eliciting undergraduate students' awareness levels of safety signs?").

    If an instrument passed this initial test, more detailed work would be undertaken. Here perhaps a small sample of students from a closely related population to that being studied (perhaps second year chemistry students in the same universities, or third year chemistry students from another university) would be asked to complete the instrument using a 'think aloud' protocol where they explain their thinking as they answer the questions – or would be interviewed about their awareness of safety signs as well as completing the instrument, to triangulate responses to the instrument against interview responses.

    Cronbach's alpha measures the internal consistency of a scale (Taber, 2018), but offers no assurance of validity. (If a good set of items meant to test enjoyment of school science were used instead to measure belief in ghosts, the set of items would still show the same high level of internal consistency despite being used for a totally invalid purpose.)
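To make that point concrete: alpha is computed purely from the item variances and the variance of total scores, so it is blind to what the items are actually about. A minimal sketch, with invented respondent scores:

```python
# Cronbach's alpha from a respondents-by-items score matrix. A high
# value only shows the items vary together (internal consistency);
# it says nothing about whether they measure the intended construct.
# The scores below are invented for illustration.

def cronbach_alpha(scores):
    """scores: list of respondents, each a list of k item scores."""
    k = len(scores[0])

    def variance(xs):
        # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Four (invented) respondents answering two items that vary together
print(cronbach_alpha([[1, 2], [2, 1], [3, 3], [4, 4]]))
```

Relabelling the very same items as a 'belief in ghosts' scale would leave the computed alpha untouched – which is exactly why a reported alpha of 0.94 offers no assurance the test measures safety sign awareness.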

    Chemistry Laboratory Test on Safety Signs (Ikiroma, Chinda & Bankole, 2021: 51)

    So, what the students had to do was match a chemical (name) with the appropriate hazard sign. What was being tested was knowledge of the hazards associated with laboratory chemicals (an important enough topic, but not what was promised).

    Had the signs not been labelled, then the items would have required BOTH knowing about the hazards of specific chemicals AND knowing which sign was used for the associated hazards. However, Ikiroma, Chinda & Bankole had actually looked "to reduce the wrong response because the students incorrectly considered only the symbol" (emphasis added). That is, they had built into the test instrument a means to ensure it did not test awareness of 'safety signs' (what they were supposedly interested in) and only measured awareness of the hazards associated with particular substances.

    What Ikiroma, Chinda & Bankole had tested was potentially useful and interesting – but it was not what they claimed. The paper title, the research questions, and the hypotheses (and consequently their statements of findings) were all misleading in that regard. One would have expected the editor and peer reviewers to have noticed that and required corrections before publication was considered.

    Quality assurance?

    The journal's website claims that "Journal of Chemistry: Education Research and Practice is an international peer reviewed journal…" Peer review involves the editor rejecting poor submissions, and ensuring the quality of what is published by arranging for experts in the field to scrutinise submissions to ensure they meet quality standards. Peer reviewers are chosen for expertise related to the specific topic of the specific submission. In particular, reviewers will ask for changes where these seem to be needed, and the editor of a journal decides to publish only when she is satisfied sufficient changes have been made in response to review reports.

    Publishing poor quality work, especially work with glaring issues, reflects badly on the authors, the journal, and the editor.4

    The journal accepted the paper about 9 days after submission.

    In this case the editor – Professor Nour Shafik Emam El-Gendy of the Environmental Sciences & Nanobiotechnology Egyptian Petroleum Research Institute in Egypt – appears to have taken just over a week to

    • arrange peer review,
    • receive and consider the referee reports, and
    • report back to the authors asking them for any changes she felt were needed, and (once she received any revisions that may have been requested) then
    • decide the paper was ready for publication in this supposed 'leading international journal'.

    That could be seen as impressive, but actually seems incredible.

    Peer review is not just about sorting good work from bad, it is also about supporting authors by showing them where their work needs to be improved before it is put on public display. Peer review is as much about improving work as selecting.

    I do not know if Ikiroma, Chinda & Bankole were expected to pay the standard charge for publishing – that is $999 for this journal – but, if so, I do not think they got value for money. Given the level of support they seem to have received from the peer review process, I think they should be entitled to a refund.

    Work cited:
    • Ikiroma, B., Chinda, W., & Bankole, I. S. (2021). Chemistry Laboratory Safety Signs Awareness Among Undergraduate Students in Rivers State. Journal of Chemistry: Education Research and Practice, 5(1), 47-54.
    • Oludipe, O. S., & Etobro, B. A. (2018). Science Education Undergraduate Students' Level of Laboratory Safety Awareness. Journal of Education, Society and Behavioural Science, 23(4), 1-7. https://doi.org/10.9734/JESBS/2017/37461
    • Taber, K. S. (2013). Non-random thoughts about research. Chemistry Education Research and Practice, 14(4), 359-362. doi: 10.1039/c3rp90009f. [Free access]
    • Taber, K. S. (2014). Ethical considerations of chemistry education research involving "human subjects". Chemistry Education Research and Practice, 15(2), 109-113. [Free access]
    • Taber, K. S. (2018). The Use of Cronbach's Alpha When Developing and Reporting Research Instruments in Science Education. Research in Science Education, 48, 1273-1296. doi:10.1007/s11165-016-9602-2

    Notes:

    1: "All works published by 'Journal of Chemistry: Education Research and Practice' is [sic, are] under the terms of Creative Commons Attribution License. This permits anyone to copy, distribute, transmit and adapt the work provided the original work and source is appropriately cited." (https://opastonline.com/journal/journal-of-chemistry-education-research-and-practice/author-guidelines)

    2: This is indeed what these authors claim – by which they seem to mean none of the students tested reached a score of half-marks. (I infer that from the way they report other results in the same study.) They report that, of 50 respondents,

    "21 (42%) could not identify correctly all [sic, could not identify correctly any of] the eight symbols presented in the survey. 24 (48%) was only able to identify one out of eight symbols presented, and 5 (10%) could identify just two. Thus, it is alarming to discover that *100% of the respondents are not aware of the laboratory signs and symbols"

    Oludipe & Etobro, 2018: 5

    (The asterisk seems to indicate which rows from a result table are being summed to give 100%.)

    3. If we simply wanted to test for awareness of safety signs we might think of displaying some jars of reagents and asking something like "is there any way you would know about which of these chemicals present particular risks?" or "how might we find out about special precautions we should take when working with these reagents?" and see if the respondents pointed out the safety signs printed on the labels.

    4. Journals that attract high volumes of submissions may have a team of editors to share the work. Some journals with several editors acknowledge the specific editor who handles each published study.

    I suspect that some predatory journals appoint editors who do not actually see the submissions (as it is difficult to see how qualified editors would approve some of the nonsense published in some journals), which are instead handled by administrators who may not be experts in the field (and so may not be in a position to judge the expertise of peer reviewers). If this is so, the editor should be described as an 'honorary editor' as misrepresenting a journal as edited by a subject expert is dishonest.

    Can deforestation stop indigenous groups starving?

    One should be careful with translation when plagiarising published texts

    Keith S. Taber


    The mastering of the art of deforestation is what enables the inhabitants of the Amazon not to die of hunger.


    Marcos Aurélio Gomes da Silva, Federal University of Juiz de Fora, Brazil

    I have been reading some papers in a journal that I believed, on the basis of its misleading title and website details, was an example of a poor-quality 'predatory' journal. That is, a journal which encourages submissions simply to be able to charge a publication fee, without doing the proper job of editorial scrutiny. I wanted to test this initial evaluation by looking at the quality of some of the work published.

    One of the papers I decided to read, partly because the topic looked of particular interest, was 'The Chemistry of Indigenous Peoples'.

    Image by 139904 from Pixabay 

    It is important to learn and teach about the science of indigenous populations

    Indigenous science is a very important topic for science education. In part this is because of the bias in many textbook accounts of science. There are examples of European scientists being credited as the discoverers of organisms, processes, and so on, that had long been known by indigenous peoples. It is not even that the Europeans re-discovered them so much as that they were informed by people who were not seen to count as serious epistemic agents. Species were often named after the person who could afford to employ collectors (often paid a pittance) to go and find specimens. This is like a more serious version of a PhD supervisor claiming a student's work as their own because the student worked for them!

    Indigenous cultures often encompass knowledge and technologies that have worked effectively, and sustainably, for millennia but which do not count as proper science because they are not framed in terms of the accepted processes of science (being passed on orally and by example, rather than being reported in Nature or Science). Of course the situation is more nuanced than that – often indigenous cultures do not (need to) make the discriminations between science, technology, myth, ritual, art, and so forth that have allowed 'modern' science to be established as a distinct tradition and set of practices.

    But science education that ignores indigenous contributions to formal science and seems to dismiss cultural traditions and ecological knowledge offers both a distorted account of science's history, and an inherent message about differential cultural worth to children.

    That is a rather brief introduction to a massive topic, but perhaps indicates why I was keen to look at the paper in the so-called 'Journal of Chemistry: Education Research and Practice' on 'The Chemistry of Indigenous Peoples' (da Silva, 2019).

    Sloppy production values

    "The Chemistry of Indigenous Peoples" had moved from submission to acceptance in 4 days, and had been published just over a week later.

    Not a lot of time for a careful peer review process

    This 'opinion article' was barely more than one page (I wondered if perhaps the journal charges authors by the word – but it seems to charge authors $999 per article), and was a mess. For example, consider the two paragraphs reproduced below: the first starts in lower case, and ends with the unexplained 'sentence', "art of dewatering: cassava"; and the second is announced as being about development (well, 'devel- opment' actually) which seems to be considered the opposite of fermentation, but then moves straight to 'deworming' which is said to be needed due to the toxic nature of some plants, and ends up explaining that deforestation is essential for the survival of indigenous people (rather contrary to the widespread view that deforestation is destroying their traditional home and culture).

    The closing three paragraphs of the article left me very confused:

    "In this sense, we  [sic – this is a single authored paper] will examine the example of the cassava root in more detail so that we can then briefly refer to other products and processes. The last section will address some of the political implications of our perspective.

    In Brazil, manioc (Manihot esculenta) is known under different names in several regions. In the south of the country, it is also called "aipim", in central Brazil, "maniva", "manaíba", "uaipi", and in the north, "macaxeira" or "carim".

    In this essay, we intend to show that, to a certain extent, companies,
    a process of invention of the Indian Indians of South America, and
    still are considerable, as businesses, until today, millions of people and institutions benefit in the Western world. We seek to provide information from a few examples regarding chemical practices and biochemical procedures for the transformation of substances that
    are unknown in Europe."

    da Silva, 2019, p.2

    My first reading of that last paragraph made me wonder if this was just the introduction to a much longer essay that had been truncated. But then I realised it seemed to be meant as a kind of conclusion. If so, the promised brief references to 'other products and processes' seem to have been omitted after the listing of alternative names in the paragraph about manioc (cassava), whilst the 'political implications' seem to refer to the garbled final paragraph ("…to a certain extent, companies, a process of invention of the Indian Indians of South America, and still are considerable, as businesses…").

    I suspected that the author, based in Brazil, probably did not have English as a first language, perhaps explaining the odd phrasing and incoherent prose. But this paper is published in a (supposed) research journal which should mean that the submission was read by an editor, and then evaluated by peer reviewers, and only published once the editor was convinced it met quality standards. Instead it is a short, garbled, and in places incoherent, essay.

    Plagiarism?

    But there is worse.

    da Silva's article, with the identified sources (none of which are acknowledged) highlighted. (The paper is published with a licence that allows reproduction.)

    I found a paper in the Portuguese language journal Química Nova called 'A química dos povos indígenas da América do Sul (The chemistry of indigenous people of Southamerica)' (Soentgen & Hilbert, 2016). This seems to be on a very similar topic to the short article I had been trying to make sense of – but it is a much more extensive paper. The abstract is in English, and seems to be the same as the opening of da Silva's 2019 paper (see the Table below).

    That is plagiarism – intellectual theft. Da Silva does not even cite the 2016 paper as a source.

    I do not read Portuguese, and I know that Google Translate is unlikely to capture the nuances of a scholarly paper. But it is a pretty good tool for getting a basic idea of what a text is about. The start of the 2016 paper seemed quite similar to the close of da Silva's 2019 article, except for the final sentence – which seems very similar to a sentence found elsewhere in the Química Nova ('New Chemistry') article.

    This same paper seemed to be the source of the odd claims about "deworming" and the desirability of deforestation in da Silva's 2019 piece. The reference to the "opposite process" (there, poisoning) makes sense in the context of the 2016 paper, as there it follows from a discussion of the use of curare in modern medicine – something borrowed from the indigenous peoples of the Amazon.

    In da Silva's article the 'opposite process' becomes 'development', and this now follows a discussion of fermentation – which makes little sense. The substitution of 'deworming' and 'deforestation' as alternatives for 'poisoning' ('desenvenenamento') converts the original text into something quite surreal.

    So, in the same short passage:

    • desenvenenamento (poisoning) becomes development (desenvolvimento)
    • desenvenenamento (poisoning) becomes deworming (vermifugação – or deparasitamento)
    • desenvenenamento (poisoning) becomes deforestation (desmatamento)

    I also spotted other 'similarities' between passages in da Silva's 2019 article and the earlier publication (see the figure above and table below). However, it did not seem that da Silva had copied all of his article from Soentgen and Hilbert.

    Rather I found another publication by Pinto (possibly from 2008) which seemed to be the source of other parts of da Silva's 2019 paper. This article is published on the web, but does not seem to be a formal publication (in an academic journal or similar outlet), but rather material prepared to support a taught course. However, I found the same text incorporated in a later extensive journal review article co-written by Pinto (Almeida, Martinez & Pinto, 2017).

    This still left a section of da Silva's 2019 paper which did not seem to originate in these two sources. I found a third Portuguese language source (Cardoso, Lobo-santos, Coelho, Ayres & Martins, 2017) which seemed to have been plagiarised as the basis of this remaining section of the article.

    At this point I had found three published sources, predating da Silva's 2019 work, which – when allowing for some variation in translation into English – seemed to be the basis of effectively the whole of da Silva's article (see the table and figure).

    Actually, I also found another publication which was even closer to, indeed virtually identical to, da Silva's article in the Journal of Chemistry: Education Research and Practice. It seems that not content with submitting the plagiarised material as an 'opinion article' there, da Silva had also sent the same text as a 'short communication' to a completely different journal.

    (Read 'A failure of peer review: A copy of a copy – or plagiarism taken to the extreme')

    Incredible coincidence? Sloppy cheating? Or a failed attempt to scam the scammers?

    Although da Silva cited six references in his paper, these did not include Cardoso et al. (2017), Pinto (2008)/Almeida et al. (2017) or Soentgen & Hilbert (2016). Of course there is a theoretical possibility that the similarities I found were coincidences, and the odd errors were not translation issues but just mistakes by da Silva. (Mistakes that no one at the journal seems to have spotted.) But it would be a very unlikely possibility – so unlikely that such an explanation seems 'beyond belief'.

    It seems that little, if anything, of da Silva's text was his own, and that his attempt to publish an article based on cutting sections from other people's work and compiling them (without any apparent logical ordering) into a new piece might have fared better if he too had taken advantage of Google Translate (which had done a pretty good job of helping me identify the Portuguese sources that da Silva seemed to have 'borrowed' for his English language article). In cutting and pasting odd paragraphs from different sources da Silva had lost the coherence of the original works, leading to odd juxtapositions and strangely incomplete sections of text. None of this seems to have been noticed by the journal editor or peer reviewers.

    Or, perhaps, I am doing da Silva an injustice.

    Perhaps he too was suspicious of the quality standards at this journal, and did a quick 'cut and paste' article, introducing some obvious, sloppy errors (surely translating the same word, 'desenvenenamento', incorrectly in three different ways in the same paragraph was meant as some kind of clue), just to see how rigorous the editing, peer review and production standards are?

    Given that the article was accepted and published in less than a fortnight, perhaps the plan backfired and poor da Silva found he had a rather unfortunate publication to his name before he had a chance to withdraw the paper. Unfortunate? If only because this level of plagiarism would surely be a sacking offence in most academic institutions.

    Previously published material | English translation (Google Translate) | The Chemistry of Indigenous Peoples (2019), Marcos Aurélio Gomes da Silva

    The contribution of non-European cultures to science and technology, primarily to chemistry, has gained very little attentions until now. | [Original was in English] | The contribution of non-European cultures to science and technology, primarily to chemistry, has gained very little attentions until now.
    Especially the high technological intelligence and inventiveness of South American native populations shall be put into a different light by our contribution. | | Especially, the high technological intelligence and inventiveness of South American native populations shall be put into a different light by our contribution.
    The purpose of this essay is to show that mainly in the area of chemical practices the indigenous competence was considerable and has led to inventions profitable nowadays to millions of people in the western world and especially to the pharmacy corporations. | | The purpose of this study was to show that mainly in the area of chemical practices; the indigenous competence was considerable and has led to inventions profitable nowadays to millions of people in the western world and especially to the pharmacy corporations.
    We would like to illustrate this assumption by giving some examples of chemical practices of transformation of substances, mainly those unknown in the Old World. | | We would like to illustrate this assumption by giving some examples of chemical practices of transformation of substances, mainly those unknown in the old world.
    The indigenous capacity to gain and to transform substances shall be shown here by the manufacture of poisons, such as curare or the extraction of toxic substances of plants, like during the fabrication of manioc flower. | | The indigenous capacity to gain and to transform substances shall be shown here by the manufacture of poisons, such as curare or the extraction of toxic substances of plants, like during the fabrication of manioc flower.
    We shall mention as well other processes of multi-stage transformations and the discovery and the use of highly effective natural substances by Amazonian native populations, such as, for example, rubber, ichthyotoxic substances or psychoactive drugs. | | We shall mention as well other processes of multi-stage transformations and the discovery and the use of highly effective natural substances by Amazonian native populations, such as, for example, rubber, ichthyotoxic substances or psychoactive drugs.
    (Soentgen & Hilbert, 2016: 1141)

    A partir disso, os povos indígenas da América do Sul não parecem ter contribuído para a química e a tecnologia moderna. | From this, the indigenous peoples of South America do not seem to have contributed to modern chemistry and technology. | The indigenous peoples of South America do not seem to have contributed to modern chemistry and technology.
    Em contraponto, existem algumas referências e observações feitas por cronistas e viajantes do período colonial a respeito da transformação, manipulação e uso de substâncias que exigem certo conhecimento químico como, por exemplo: as bebidas fermentadas, os corantes (pau-brasil, urucum), e os venenos (curare e timbó). | In contrast, there are some references and observations made by chroniclers and travelers from the colonial period about the transformation, manipulation and use of substances that require certain chemical knowledge, for example: fermented beverages, dyes (pau-brasil, annatto), and poisons (curare and timbó). | In contrast, there are some references and observations made by chroniclers and travelers from the colonial period regarding the transformation, manipulation and use of substances that require certain chemical knowledge, such as fermented beverages, dyes (pigeon peas, Urucum), and the poisons (Curare and Timbó).
    Mesmo assim, estas populações acabam sendo identificadas como "selvagens primitivos" que ainda necessitam de amparo da civilização moderna para que possam desenvolver-se. | Even so, these populations end up being identified as "primitive savages" who still need the support of modern civilization so that they can develop. | Even so, these populations end up being identified as "primitive savages" who still need the support of modern civilization in order for them to develop.
    (Soentgen & Hilbert, 2016: 1141)

    A pintura corporal dos índios brasileiros foi uma das primeiras coisas que chamou a atenção do colonizador português. | The body painting of Brazilian Indians was one of the first things that caught the attention of the Portuguese colonizer. | Body painting of the Brazilian Indians was one of the first things that caught the attention of the Portuguese colonizer.
    Pero Vaz de Caminha, em sua famosa carta ao rei D. Manoel I, já falava de uns "pequenos ouriços que os índios traziam nas mãos e da nudeza colorida das índias. | Pero Vaz de Caminha, in his famous letter to King D. Manoel I, already spoke of "small hedgehogs that the Indians carried in their hands and the colorful nudity of the Indians. | Pero Vaz de Caminha, in his famous letter to King D. Manoel I, already talked about little hedgehogs that the Indians carried in their hands.
    Traziam alguns deles ouriços verdes, de árvores, que na cor, quase queriam parecer de castanheiros; apenas que eram mais e mais pequenos. | They brought some of them green hedgehogs, from trees, which in color, almost they wanted to look like chestnut trees; only that they were smaller and smaller. | They brought some of them green hedgehogs, trees, who in color almost wanted to appear of chestnut trees; just that they were more and more small.
    E os mesmos eram cheios de grãos vermelhos, pequenos, que, esmagados entre os dedos, faziam tintura muito vermelha, da que eles andavam tintos; e quando se mais molhavam mais vermelhos ficavam" | And they were full of small, red grains, which, crushed between the fingers, made a very red tincture, from which they were red; and when they got more wet the redder they turned" | And the same were filled with red, small [sic], which, crushed between the fingers, made very red dye from the [sic] that they walked red [sic]; and when the more they wet the more red they stayed.
    (Pinto, 2008: pp.1-2; also Almeida, Martinez & Pinto, 2017)

    Os índios do Alto Xingú pintam a pele do corpo com desenhos de animais, pássaros e peixes. | The Indians of Alto Xingu paint the skin of their bodies with drawings of animals, birds and fish. | The Indians of Alto Xingú paint thebody [sic] skin with animal drawings, birds and fish.
    Estes desenhos, além de servirem para identificar o grupo social ao qual pertencem, são uma maneira de uní-los aos espíritos, aos quais creditam sua felicidade. | These drawings, in addition to serving to identify the social group to which they belong, are a way to unite them with the spirits, to whom they credit their happiness. | These drawings besides serving to identify the social group at thewhich [sic] they belong, are a way of unite them with the spirits, to whom they credit their happiness.
    A tinta usada por esses índios é preparada com sementes de urucu, que se colhe nos meses de maio e junho. | The ink used by these Indians is prepared with annatto seeds, which are harvested in May and June. | The ink used by these Indians is prepared with urucu seeds, which is collected in the monthsof [sic] May and June.
    As sementes são raladas em peneiras finas e fervidas em água para formar uma pasta. | The seeds are grated into fine sieves and boiled in water to form a paste. | The seeds are grated in fine [sic] and boiledwater [sic] to form a paste.
    Com esta pasta são feitas bolas que são envolvidas em folhas, e guardadas durante todo o ano para as cerimônias de tatuagem. | This paste is used to make balls that are wrapped in sheets, and kept throughout the year for the tattoo ceremonies. | With this paste balls are made which, involved in sheets, are stored throughout the year for the tattoo ceremonies.
    A tinta extraída do urucu também é usada para tingir os cabelos e na confecção de máscaras faciais. | The dye extracted from the annatto is also used to dye hair and make facial masks. | The ink extracted from Urucu is also used dyeing hair and making tion [sic] of facial masks.
    (Pinto, 2008: p.4; also Almeida, Martinez & Pinto, 2017)

    O urucu é usado modernamente para colorir manteiga, margarina, queijos, doces e pescado defumado, e o seu corante principal – a bixina – em filtros solares. | Annatto is used in modern times to color butter, margarine, cheeses, sweets and smoked fish, and its main coloring – bixin – in sunscreens. | Urucu is used coloring page [sic] butter, margarine, cheeses, sweets andsmoked [sic] fish, and its colorant main – bixina – in solar filters.
    (Pinto, 2008: p.4; also Almeida, Martinez & Pinto, 2017)

    Assim, foram identificados possíveis conteúdos de Química que poderiam estar relacionados com a preparação do Tarubá, como misturas, separação de misturas e processos de fermentação. | Thus, possible contents of Chemistry were identified that could be related to the preparation of Tarubá, such as mixtures, separation of mixtures and fermentation processes. | it was possible to identify possible contents of Chemistry that could be related to the preparation of Tarubá, such as mixtures, separation of mixtures and fermentation processes.
    O processo de preparação da bebida feita da mandioca ralada, envolve a separação da mistura entre o sólido da massa da mandioca e o líquido do tucupi, feito através do processo de filtração com o tipiti, instrumento tradicional indígena. | The process of preparing a drink made from grated cassava involves separating the mixture between the solid of the cassava mass and the liquid from the tucupi, made through the filtration process with tipiti, a traditional indigenous instrument. | The process of preparation of the beverage made from grated cassava involves the separation of the mixture between the solid of the cassava mass and the liquid of the tucupi, made through the filtration process with the tipiti, a traditional Indian instrument.
    A massa é peneirada, assada e colocada em repouso por três dias, quando ocorre o processo de fermentação, em que o açúcar, contido na mandioca, é processado pelos microrganismos e transformado em outras substâncias, como álcool e gases. | The dough is sifted, baked and put to rest for three days, when the fermentation process takes place, in which the sugar, contained in the cassava, is processed by microorganisms and transformed into other substances, such as alcohol and gases. | The dough is sieved, roasted and put to rest for three days, when the fermentation process occurs, in which the sugar contained in cassava is processed by microorganisms and transformed into other substances such as alcohol and gas.
    Após esse período, se adicionam água e açúcar à massa coada, estando a bebida pronta para ser consumida. | After this period, water and sugar are added to the strained mass, and the drink is ready to be consumed. | After this period, water and sugar are added to the batter, and the beverage is ready to be consumed.
    (Cardoso, Lobo-santos, Coelho, Ayres & Martins, 2017)

    | | art of dewatering: cassava

    Agora gostaríamos de voltar a atenção para o processo oposto, o desenvenenamento. | Now we would like to turn our attention to the opposite process, the poisoning. | Now we would like to turn our attention to the opposite process, the devel- opment [sic].
    Ainda que não exija técnicas tão sofisticadas quanto a produção de substâncias, o desenvenenamento é um procedimento fundamental para as pessoas que vivem e queiram sobreviver na floresta tropical amazônica, tendo em vista que muitas plantas de lá produzem veneno em virtude de seu metabolismo secundário. | Although it does not require such sophisticated techniques as the production of substances, poisoning is a fundamental procedure for people who live and want to survive in the Amazon rainforest, considering that many plants there produce poison due to their secondary metabolism. | Although it does not require techniques as sophisticated as the production of substances, the deworming is a fundamental procedure for the people who live and want to survive in the rainforest Amazon, since many plants of there produce poison by virtue of its secondary metabolism.
    Afinal, a forma que muitas espécies de plantas possuem para evitar a mordida de insetos é a produção de recursos químicos defensivos. | After all, the way that many plant species have to avoid insect bites is the production of defensive chemical resources. | After all, the way that many plant species have to avoid insect bite is the production of defensive chemical resources.
    Quem quer sobreviver na floresta tropical precisa saber como neutralizar ou afastar essas substâncias tóxicas produzidas pelas próprias plantas. | Anyone who wants to survive in the rainforest needs to know how to neutralize or remove these toxic substances produced by the plants themselves. | Whoever wants to survive in the rainforest needs to know how to neutralize or ward off these toxic substances produced by the plants themselves.
    O domínio da arte do desenvenenamento é o que possibilita os habitantes da Amazônia a não morrerem de fome. | Mastering the art of poisoning is what makes it possible for the inhabitants of the Amazon not to starve. | The mastering of the art of deforestation is what enables the inhabitants of the Amazon not to die of hunger.
    (Soentgen & Hilbert, 2016: 1145)

    Nesse sentido, examinaremos o exemplo da raiz de mandioca de maneira mais detalhada para então, na sequência, fazermos referência sumária a outros produtos e processos. | In this sense, we will examine the cassava root example in more detail and then, in the sequence, make a brief reference to other products and processes. | In this sense, we will examine the example of the cassava root in more detail so that we can then briefly refer to other products and processes.
    A última seção tratará de algumas implicações políticas de nossa perspectiva. | The last section will deal with some policy implications from our perspective. | The last section will address some of the political implications of our perspective.
    No Brasil, a mandioca (Manihot esculenta) é conhecida sob diversos nomes em diversas regiões. | In Brazil, cassava (Manihot esculenta) is known under several names in different regions. | In Brazil, manioc (Manihot esculenta) is known under different names in several regions.
    No sul do país, ela também se chama "aipim", no Brasil central, "maniva", "manaíba", "uaipi", e no norte, "macaxeira" ou "carim". | In the south of the country, it is also called "casino" [sic], in central Brazil, "maniva", "manaíba", "uaipi", and in the north, "macaxeira" or "carim". | In the south of the country, it is also called "aipim", in central Brazil, "maniva", "manaíba", "uaipi", and in the north, "macaxeira" or "carim".
    (Soentgen & Hilbert, 2016: 1145)

    Neste ensaio, pretendemos mostrar que, no que concerne ao conhecimento relativo às práticas químicas, a criatividade e a inteligência técnica dos povos indígenas da América do Sul, são competências consideráveis até os dias de hoje. Os povos ameríndios, em especial os da bacia amazônica, desenvolveram práticas que levaram a invenções das quais, até hoje, milhões de pessoas se beneficiam. | In this essay, we intend to show that, with regard to knowledge related to chemical practices, creativity and technical intelligence of the indigenous peoples of South America are considerable competences to this day. The Amerindian peoples, especially those from the Amazon basin, developed practices that led to inventions from which, to this day, millions of people benefit. | In this essay, we intend to show that, to a certain extent, companies, a process of invention of the Indian Indians of South America, and still are considerable, as businesses, until today, millions of people and institutions benefit in the Western world.
    (Soentgen & Hilbert, 2016: 1141)

    Gostaríamos de documentar essas afirmações com alguns exemplos, limitando-nos a apresentar apenas produtos feitos a partir de substâncias que eram inteiramente desconhecidos na Europa. | We would like to document these claims with a few examples, limiting ourselves to presenting only products made from substances that were entirely unknown in Europe. | We seek to provide information from a few examples regarding chemical practices and biochemical procedures for the transformation of substances that are [sic!] unknown in Europe.
    (Soentgen & Hilbert, 2016: 1142)

    Text of da Silva's 2019 article (in its published sequence) is juxtaposed against material that seems to have been used as unacknowledged sources (paragraphs have been broken up to aid comparisons).

    Works cited:
    • Almeida, M. R., Martinez, S. T. & Pinto, A. C. (2017) Química de Produtos Naturais: Plantas que Testemunham Histórias. Revista Virtual de Química, 9 (3), 1117-1153.
    • Cardoso, A. M. C., Lobo-santos, V., Coelho, A. C. S., Ayres, J. L. & Martins, M. M. M. (2017) O Processo de preparação da bebida indígena tarubá como tema gerado para o ensino de química. 57th Congresso Brasileiro de Química. http://www.abq.org.br/cbq/2017/trabalhos/6/11577-25032.html
    • da Silva, M. A. G. (2019) The Chemistry of Indigenous Peoples. Journal of Chemistry: Education Research and Practice, 3 (1), pp. 1-2.
    • Pinto, A. C. (2008) Corantes naturais e culturas indígenas: http://www.luzimarteixeira.com.br/wp-content/uploads/2010/04/corantes-curiosidades.pdf
    • Soentgen, J. & Hilbert, K. (2016) A química dos povos indígenas da América do Sul. Química Nova, 39 (9), pp. 1141-1150.

    Guiding the work of palliative care

    Keith S. Taber

    I recently heard from the journal 'Archives of Palliative Care' who claim to be able to "enhance the quality" of my work. As – to the best of my knowledge – palliative care is an area of medical work seeking to make life as comfortable as possible for the terminally ill, this is not a journal I've tended to read.
    From: Editorial Assistant, Archives of Palliative Care
    Call for paper: community engaged

    Dear Dr. Taber Keith S

    I enjoyed your recent paper with the title Secondary students' values and perceptions of science-related careers: responses to vignette-based scenarios. We would like to continue working in this area under your guidance. Would you please tell me whether you have any new manuscripts available in your area of site? Thank you for your time
    I have written back to see how the journal feels I can contribute…as surely Sherline would not have written to me to tell me she had read my work and feels it is relevant to her journal unless that is indeed true?
    Dear Sherline

    Thank you for your kind message. It was so good to hear that you enjoyed our article 'Secondary students' values and perceptions of science-related careers: responses to vignette-based scenarios'. It was quite a small piece of work arising from a larger collaborative project, but I was rather proud of it. It is always rewarding to hear that someone has found time to engage with the work and has got something useful from it.

    I was intrigued to learn that 'Archives of Palliative Care' is interested in working in this area under my guidance, as I do not think we would likely have considered the journal an obvious outlet for our work. I am not sure we have anything else worked up for submission at this time, but perhaps if you could tell me what aspects of 'Secondary students' values and perceptions of science-related careers: responses to vignette-based scenarios' you found especially relevant, and how you feel our work can best contribute to 'Archives of Palliative Care', then I could give some serious consideration to whether we might have anything yet to be worked up which might be suitable.

    Best wishes

    Keith

      The article Sherline enjoyed does include some comments from young people reflecting on whether they would be comfortable entering medicine as a career (as one of a number of focal areas of scientific work discussed in the study), but that link seems too tenuous to suggest our research fits in a journal on palliative care. But perhaps Sherline will get back to me and enlighten me.

    Update:

    Sherline has indeed got back to me: On 15/07/2021 11:39, Archives of Palliative Care wrote:
    Dear Dr. Taber Keith S,

    Greetings!! Thank you for your immediate response towards our journal. The knowledge present in your published manuscript is so useful to future researchers . this was the reason we want to publish your manuscript in our journal.

    Awaiting for your response.

    Best Regards,
    This response remains at a very general level – indeed, it is the kind of response that Sherline could have made even if she were not an honest person and had not even read the article ('Secondary students' values and perceptions of science-related careers: responses to vignette-based scenarios') she claimed to have enjoyed so much. So, I remain unconvinced, but await clarification of how my work is relevant.
    Dear Sherline

    Thank you for your comments. It is obviously gratifying that you see so much of value in our work, and flattering that you want to publish a manuscript from me in your journal 'on spec' (that is, without even seeing what I might write).

    I imagine I could write something developing my thoughts further on this topic, but do you really feel that this would fit in your journal? (And would it not be a matter for referees to evaluate the relevance and quality of the work in peer review – or do you include invited papers?)

    Of course, I would like to contribute if that were viable, if I were to be persuaded that my work was relevant to your readers, but I am busy with other ongoing writing, and despite your very kind evaluation of my recent work I would need some convincing that there really is a good fit with Archives of Palliative Care.

    Best wishes

    Keith
    Sadly, whilst my initial response to the invitation was that this was an entirely incongruent request – as anything I could write would not be relevant to the journal – as I composed this response I started to actually think about how I could develop something building on the published work which might explore how young people might feel about going to work in palliative care medicine… Perhaps there would be a role for me in enticing submissions for dodgy journals?

    Read about 'Secondary students' values and perceptions of science-related careers: responses to vignette-based scenarios'

    Read about journals and poor academic practice

    Read about more examples of illogical connections between published work and invitations from journals and conferences