Hello fellow journalologists,
Scholarly publishing tends to move at a glacial pace. But not this week.
On Thursday Learned Publishing released a paper entitled “The rise of a mega-journal in public health publishing”, which focused on the rapid growth of an MDPI journal, The International Journal of Environmental Research and Public Health (IJERPH).
Then, a few hours later, I was alerted to a post on LinkedIn by Christos Petrou about IJERPH’s rapidly shrinking article output. Christos wrote:
Following the delisting of IJERPH from WoS, the journal shrunk by 85% in comparison to its peak, meaning that the second largest academic journal in 2022 (17.5k papers) is turning into a sizeable, yet considerably smaller journal (>1k papers annually).
The delisting appears to have affected MDPI more broadly: excluding IJERPH, MDPI shrunk in the last three months by 22% against its peak (Dec-2022 to Feb-2023), averaging 23k monthly articles from 29k articles.
By way of a recap, the International Journal of Environmental Research and Public Health was the second largest journal in 2022; it was delisted from the Web of Science (WoS) by Clarivate back in March (see Journalology #22: Delisted) and so didn’t receive a journal impact factor (JIF) in June. It seemed highly likely that article output would decrease as a result.
MDPI journals tend to publish papers very quickly (41 days from submission to publication on average), which means that their pipeline is short. As a result, a drop in submissions becomes quickly apparent in the article output. Or at least that’s what I presumed had happened when I first read Christos’ LinkedIn post.
Christos is preparing an article on this and I don’t want to steal his thunder, so I’m not going to cover MDPI’s entire portfolio here. However, I was impatient to find out what had happened to IJERPH and so I plotted this graph (source: PubMed; filtered to articles with Abstracts), which shows article output over time. The July data run through to July 24th.
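For anyone who wants to reproduce this kind of count, here is a minimal sketch of one way to query PubMed’s E-utilities. It’s my illustration, not the exact script behind the graph; the journal abbreviation and the hasabstract filter are assumptions about the query. Swapping the month tag for a full date, e.g. "2023/03/15"[dp], gives day-level counts like those shown further down.

```python
# Sketch: count IJERPH records with abstracts per month via PubMed's
# E-utilities esearch endpoint. Illustrative only; the query terms are
# my assumptions, not the script used for the graph above.
import xml.etree.ElementTree as ET

import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"


def monthly_count(year: int, month: int) -> int:
    """Return the number of matching PubMed records for one month."""
    term = (
        '"Int J Environ Res Public Health"[Journal] '
        f"AND {year}/{month:02d}[dp] AND hasabstract"
    )
    resp = requests.get(ESEARCH, params={"db": "pubmed", "term": term, "retmax": 0})
    resp.raise_for_status()
    # esearch returns XML; the total hit count sits in the <Count> element.
    return int(ET.fromstring(resp.text).findtext("Count"))


for m in range(1, 8):  # January through July 2023
    print(f"2023-{m:02d}: {monthly_count(2023, m)} articles")
```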
Article output dropped suddenly in March 2023, which is much faster than I would have expected if a fall in submissions was the underlying cause. Here’s an excerpt from a March 22 MDPI announcement:
On March 15, 2023, we received note that Clarivate has discontinued coverage of IJERPH and JRFM in Web of Science as of 13 and 22 February respectively.
It seemed strange that article output fell in March when the delisting only became public on March 22, so I plotted a graph of article output in IJERPH on each day in March 2023.
It looks as though MDPI voluntarily put a brake on publications immediately after being informed by Clarivate (on March 15) that the journal had been delisted. To confirm this, I contacted the MDPI CEO, Stefan Tochev, yesterday; he kindly replied within a few hours. He said:
In mid-March, we were notified that IJERPH was delisted from The WoS, effective 13 February. This sudden notification disrupted our journal's workflow, impacting pending articles under review and in preparation. We had to temporarily halt these processes to gather additional information and notify authors whose papers were placed on-hold for processing. The authors then had to decide whether to proceed with publishing in IJERPH or seek alternative options. Although Clarivate’s decision, without prior warning, took us by surprise, we quickly responded by notifying authors and addressing stakeholder concerns. While this outcome is disappointing for IJERPH, as well as for our authors, academic editors, and the entire scientific community supporting our journal, we see it as an opportunity to reflect and prepare for the future direction of the journal.
In the first two months of 2023 IJERPH published around 60 articles with abstracts per day (MDPI publishes 7 days a week). That number fell to around 11 per day in April, May and June, a drop of just over 80%. This gives an indication of the importance of JIFs to academics. It looks as though 5 out of every 6 authors walked away when told by MDPI that the journal they had submitted to had been delisted from the Web of Science.
IJERPH was not delisted because of paper mills or large-scale misconduct. Rather, it published papers that were outside of its editorial scope. The MDPI announcement said:
Clarivate found that both journals failed the Content Relevance criterion, highlighting publications that were deemed outside the scope of the journal, and not to do with quality of the publications. On 31 March 2023, we formally appealed the decision to discontinue IJERPH in Web of Science.
A month ago, MDPI published IJERPH: Looking Towards the Future, which announced that Clarivate had rejected MDPI’s appeal to reverse the delisting.
It’s true that the IJERPH editorial team published papers that had very little to do with the journal’s aims and scope. At the start of January 2023, IJERPH (which, remember, covers environmental research and public health) published the following articles:
(I selected these three articles by eyeballing the content published at the start of January).
In summary, MDPI acted promptly when Clarivate informed it that IJERPH had been delisted. The company immediately paused publication of IJERPH, informed authors that the journal would lose its JIF, and gave them the option to withdraw their paper. This does not strike me as the action of a publisher that’s ‘predatory’. It’s also worth remembering that MDPI waives 25-27% of APCs (article processing charges), far more than many mainstream publishers.
However, some of the articles published in IJERPH had very little to do with either environmental research or public health. Under an author-pays business model, the wider community is likely to be skeptical of a journal that takes an APC for publishing a paper that has little to do with the journal’s stated topic.
Furthermore, the special issue editorial model has allowed MDPI, and other publishers, to scale rapidly, but devolving editorial decisions to guest editors poses significant risks to a journal’s reputation.
Reputation is hard won and easily lost. For all the discussions around DORA, the IJERPH case study demonstrates how important JIFs are to academics and the perils of publishing articles outside of a journal’s editorial scope.
News
At least two-thirds of the editorial board of Wiley’s Journal of Biogeography have resigned, citing the publisher’s push towards “exorbitant” open access fees and what they claimed was a policy to steer rejected manuscripts to other titles. Former editor-in-chief Mike Dawson announced his resignation in June and 64 of his associate editors have been refusing to handle new manuscripts since then, part of an increasing trend of journal editorial boards deciding to take action en masse.
Times Higher Education (Ben Upton)
Dawson said he suspected an email he sent informing authors who had submitted to the journal during the work stoppage that processing their manuscripts could be delayed sparked Wiley’s decision to fire him. He said Wiley initially attempted to fire him by email, effective immediately. He pointed out that termination without notice was in breach of his contract, and later that day received a letter giving him 30 days’ notice. A Wiley spokesperson confirmed the dates of his resignation and termination but would not otherwise comment on the staffing changes.
Retraction Watch (Ellie Kincaid)
Some programs, such as Consensus, give research-backed answers to yes-or-no questions; others, such as Semantic Scholar, Elicit and Iris, act as digital assistants — tidying up bibliographies, suggesting new papers and generating research summaries. Collectively, the platforms facilitate many of the early steps in the writing process. Critics note, however, that the programs remain relatively untested and run the risk of perpetuating existing biases in the academic publishing process.
Nature (Amanda Heidt)
The political and societal move towards Open Science is well established as the demand for access to all publicly funded research intensifies. The retention of the 12-month embargo, as the Appropriations Bill stipulates, is a retrograde step within the global context of research and access to scientific literature.
Fully OA Publishers (letter signed by executives from 8 organisations)
STM has released the first part of a new SDG Roadmap to guide publishers large and small in implementing the U.N. Sustainable Development Goals (SDGs) and supporting sustainability more broadly. This dedicated toolkit of resources for scholarly publishers takes the potentially overwhelming framework of the SDGs and breaks it down into concrete steps. The roadmap offers practical suggestions, starting with small steps like signing the SDG Publishers’ Compact and sending a questionnaire to gather your colleagues’ views about SDGs. Later, it guides you through selecting the SDGs that are most relevant to your organisation, nominating individual(s) to coordinate SDG efforts and more.
STM announcement
JB: Publishers are increasingly engaging with the UN Sustainable Development Goals and this roadmap provides a list of activities to help STM members get started. For me, the most important step is the final one, “Report In”. There’s a risk that the SDG Publishers Compact becomes a corporate PR tool rather than a structure to help researchers communicate solutions to global society’s biggest challenges. Regular public reporting on actions taken is the best way for publishers to stay on track and deliver meaningfully on the SDGs.
ReviewerCredits sits at the forefront of developments in peer review (and therefore publishing) as we provide a clear and innovative solution to verification, reward via credits, and recognition. Our unique 2-factor authentication, including via ORCID iD integration, means that researchers signed-up and verified on ReviewerCredits ‘are who they say they are’ and can be matched via our unique Reviewer Graph with editors and journals to enhance their careers first through peer review and then rewards and recognition.
ReviewerCredits (Sven Fund)
Community Corner
Last week’s testimonial came from a new CEO. This week’s contribution comes from a new editor. When I write this newsletter I always have journal editors in mind. If other members of our community enjoy Journalology that’s great, but my goal is to help editors better understand the rapidly changing environment that they’re operating in. With that in mind, this testimonial made me smile when I first read it. Thank you for taking the time to leave a comment on the Community Wall, Shalini.
Opinion
On March 8, we celebrated International Women's Day to raise greater awareness of women's rights. The Lancet Series on breastfeeding could not have been published at a better time… But the Series perfectly illustrates our struggle, particularly on “the role of actors, interests, and systems of power in shaping infant and young child feeding patterns and outcomes”. Most authors in the Series are White men from high-income countries. No woman is in a leadership position in the Series; men led all three publications. The authorship in the Series represents these systems of power, but the authors offer no reflection on this point.
The Lancet (Mélissa Mialon)
The authors of the 2023 Lancet Series on breastfeeding replied:
However, her assertion that the majority of authors of our Series are White men from high-income countries is factually incorrect. More than half (14/25; 56%) of the Series authors are women, and we represent 11 countries across six continents. Our paper leads oversaw a writing process that was unquestionably inclusive, and our voices are strongly reflected in every aspect of the Series. The men on the team were—and continue to be—committed allies and champions of women's rights.
JB: When I was an editor at The Lancet the journal published 8 pages of Correspondence each week (as I recall). This was essentially post-publication peer review. The journal doesn’t publish as much Correspondence now, presumably because of page budget pressures. It’s a shame that more journals don’t provide a venue for readers to debate issues and hold authors (and editors) to account for what was previously published.
In addition to a clear and compelling title, I cannot emphasize enough how important the table of contents (ToC) image is for busy people to unconsciously decide to stop skimming and click on your paper. It is a fast mental decision tree to go from “Hey, that looks interesting!”, to the critical click that links the reader to your paper. A clear and interesting ToC image that conveys the key message of your manuscript will attract the attention of a potential reader, and draw them to your hard-earned publication so that they click on the link to your paper, and read it. Please keep in mind when assembling your ToC image that it needs to be legible not only on a computer screen or monitor, but also on a mobile device.
ACS Nano (Jillian M. Buriak)
Addressing One Health research questions often requires combining tools and approaches from different areas of science. As the scientific field of One Health research develops, it is also worth considering whether more formal frameworks for these studies could be adopted. For example, are there specific requirements for studies to be classified as One Health research, how can One Health considerations be best built into study designs, and how can evidence from different domains be appropriately triangulated to support actions. As a multidisciplinary journal, Nature Communications welcomes submission of articles across the spectrum of One Health science. We also encourage submissions aiming to develop best practices in the conduct and reporting of these studies.
Nature Communications (unsigned editorial)
Surgical journals use videos for educational and promotional purposes. YouTube is a suitable social media platform for sharing videos of journal content. The Surgery journal experience on YouTube can be used to learn important information on the nature of video content, the measurement of performance, and the benefits and challenges of using YouTube to disseminate journal content. Video content can be created to deliver information and infotainment. The online performance of videos can be measured using various metrics on YouTube Analytics, including content views and engagement metrics. There are several benefits to the use of YouTube videos by surgical journals, including the dissemination of reliable information, language versatility and diversity, open access and portability, increased visibility for authors and journals, and the humanization of the journal interface.
Surgery (Ameera J.M.S. AlHasan)
AI can leverage scholarly content to create a countless number of products and the more use cases that can be created the more value it will have. It behooves publishers to make their content as relevant as possible in their respective disciplines and consider how they want to be part of the move towards a Generative AI world. If our curated content is as valuable as I believe it can be when properly leveraged, might we even see GenAI companies looking at scholarly publishers as potential targets of acquisition?
The Scholarly Kitchen (Avi Staiman)
Editors recognize the inequity of $APCs, especially at oligopoly rates. These $APCs are priced well above the cost of production. Where editors once balked at the subscription price of the journals they labor for, they now protest the $APC. Oligopoly profits are directly tied to volume of articles and $APCs. Which means that editors are now pushed into a role that directly generates revenue. Some of the defecting editors note pressure to increase the number of acceptances. And a former editor of Journal of Political Philosophy links that phenomenon to “open-access publishing agreements”, more widely known as “transformative agreements.” Editors reject being turned into invoice-generating, yes people, but this is where the logic of capital has brought us.
By Every Means Necessary (Dave Ghamandi)
As peer-review is such a fundamental pillar of science, we hope that this study encourages the research community to further explore how AI is changing peer-review itself. We have open-sourced our codebase (https://github.com/uzh-rpg/authorship_attribution) in the hope that it serves as a starting point for scholars to pick up our work and build on top of it. Authorship attribution and plagiarism detection are vital to ensure the continued integrity and trustworthiness of academic publishing, and enhancing it will be beneficial to the entire scientific community.
Impact of Social Sciences (Leonard Bauersfeld, Angel Romero, Manasi Muglikar and Davide Scaramuzza)
The Contributor Roles Taxonomy (CRediT), devised in 2015, is one. This breaks down contributions into 14 categories: conceptualisation, (provision of) resources, data curation, (writing) software, formal analysis, supervision, funding acquisition, validation, investigation, visualisation, methodology, project administration, writing the original draft (including translation) and reviewing and editing it (both before external peer review and after it). That taxonomy is now an American national standard. Such transparency answers a criticism of more traditional approaches to authorship. Some journal articles, in areas such as high-energy physics and clinical medicine, have hundreds of authors attributed to them. How is it possible to know who contributed what in this scenario?
Times Higher Education (Paul Ayris)
JB: I wonder why organisations like ICMJE have stuck with traditional definitions of authorship and have not adopted CRediT, which is a better way of making roles and responsibilities transparent.
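The taxonomy is small enough to encode and validate directly. Here’s a minimal sketch (hypothetical author names and role assignments; not Tenzing’s or any journal’s actual format) of recording per-author CRediT contributions:

```python
# The 14 roles of the CRediT taxonomy (official US spellings).
CREDIT_ROLES = {
    "Conceptualization", "Data curation", "Formal analysis",
    "Funding acquisition", "Investigation", "Methodology",
    "Project administration", "Resources", "Software", "Supervision",
    "Validation", "Visualization", "Writing - original draft",
    "Writing - review & editing",
}

# Hypothetical author-to-roles mapping for a three-author paper.
contributions = {
    "A. Author": {"Conceptualization", "Methodology", "Writing - original draft"},
    "B. Author": {"Data curation", "Formal analysis", "Visualization"},
    "C. Author": {"Supervision", "Funding acquisition", "Writing - review & editing"},
}

# Basic validation: every declared role must be one of the 14.
for author, roles in contributions.items():
    unknown = roles - CREDIT_ROLES
    assert not unknown, f"{author} lists non-CRediT roles: {unknown}"
```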
Tenzing was launched in 2020 to help researchers to record their roles in a project from the start using the CRediT system. It is particularly helpful at the time of manuscript submission to a journal, as a researcher might face certain obstacles (such as author labor) during the process of accumulating information about every author that was engaged in the project from the beginning. However, there is no one-size-fits-all approach to all problems. Although the application of these tools seems clear for researchers and publishers, there is still much to learn about how CRediT could be implemented by funders directly.
DORA (Queen Saikia)
JB: I hadn’t heard of Tenzing before. The tool can be found here and an explanatory PLOS ONE paper (from 2020) can be found here.
This is a concise overview of how to go about crafting a news release about research findings, but most of the rules here should be viewed more as guidelines. Yes, a news release must be honest and accurate about the research—that is nonnegotiable. On the other points, there is often room to maneuver. For example, you can use more technical language when writing about work that may be of interest almost exclusively to news outlets that focus on discipline-specific audiences. And it is okay to have fun with the subject, as long as the researchers are on board and you keep your target audiences in mind.
Science Editor (Matt Shipman)
Many publishers currently work to accommodate transparent and reproducible research and to evolve with changing needs and mandates. This support frequently comes in the form of publishing guidelines and checklists (e.g., STAR, Nature checklist). However, many publishers encourage depositing details or additional documents in supplemental files, which introduces discovery, reuse, and citation challenges. For example, moving detailed research methods or protocols to a supplemental file inhibits another researcher from applying the methods to a new data set and limits the citation of the original protocol. More than just presenting challenges for reuse, these practices limit science and the acceptance of these materials as first-class research objects. Publishers must further their push for research transparency and move beyond just data into the sharing of complete methods, interactive models, code, and software.
Frontiers in Research Metrics and Analytics (Leslie D. McIntosh and Cynthia Hudson Vitale)
Journals and publishers also fail to do their part, finding ways to ignore criticism of what they have published, leaving fatally flawed work unflagged. They let foxes guard the henhouse, by limiting critics to brief letters to the editor that must be approved by the authors of the work being criticized. Other times, they delay corrections and retractions for years, or never get to them at all.
The Guardian (Ivan Oransky and Adam Marcus)
After in-depth conversation and planning with our JU editorial leadership team at our most recent annual meeting, AUA2023, in Chicago, we unveiled an important initiative for JU: the Pathway to Publication (P2P). Top-rated abstracts submitted for AUA2023 were further adjudicated by members of JU’s senior editorial team, and invitations were sent to authors to accelerate submission of the resulting manuscripts for simultaneous publication at the annual meeting. Editorial, reviewing, and statistical teams prioritized their work to diligently assess the papers. Although not all the invited manuscripts were ultimately successful, those who had their JU submissions accepted had their articles published online at the precise time of their presentation in Chicago as well as press and social media releases showcasing their work.
Journal of Urology (Jennifer Regala and D. Robert Siemens)
JB: This editorial contains the first citation (in a scholarly journal) to the Journalology newsletter. Annoyingly, the citation was to a newsletter published in 2023, so although Journalology now has an immediacy index, its JIF will remain at zero. Since JIF is everything, in the future please wait at least a year before citing Journalology. Thank you.
When honest mistakes are made—such as when Science failed to post two corrections in 2015 that were approved by its editors in the Tessier-Lavigne case—it is difficult to dispel suspicions that journals or authors are trying to hide something. Honest mistakes happen, and journals need to be accessible and on the record about their behaviors. Issuing carefully worded statements and “no comment” has no place in a generative culture. Meanwhile, although there have been good recent discussions about universities and journals working together to accelerate corrections and retractions, the universities need to realize that threats of litigation may not be the major consideration when so many within and outside the scientific community are losing trust in science. Rather, a healthy generative culture would direct universities to be forthcoming with information and encourage journals to correct the record.
Science (H. Holden Thorp)
In Figure 1, we can see that the number of articles that mention ‘prosecco’ has steadily grown in the last decade, with a pronounced increase in 2021. However, this seemed to tail off in 2022, so perhaps interest in the topic has started to wane. This almost exactly mirrors global sales of prosecco and Italian wine in general, which have tailed off in 2022 after performing well during the pandemic.
Digital Science blog (Simon Linacre)
JB: Bibliometrics is never dull…
Journal Club
The evidence above conclusively shows that AI is useful for helping to find reviewers and the spread of this technology to other contexts where it is not yet used, such as REF reviewer assignments, is recommended. There is also evidence that AI can sometimes support the initial quality control of submitted manuscripts. Although plagiarism detection is the obvious example, and is presumably widely used by publishers, statistical checking also seems useful and extending the capability of such software would be valuable. In contrast, there is insufficient evidence yet to use AI to support reviewing and it should not be used to replace human reviewers.
Learned Publishing (Kayvan Kousha and Mike Thelwall)
This cross-sectional study found that one-third of oncology clinical trials reported results in at least 1 of 3 platforms (ClinicalTrials.gov, publications, or ASCO Annual Meetings) within 1 year of completion and just over half within 3 years. NIH-funded trials had higher results-reporting rates compared with trials sponsored by other funders. Results were more likely to be reported on ClinicalTrials.gov compared with in publications or at ASCO meetings. Given the importance of detailed results reporting and peer review facilitated through journal publication, our results suggest that efforts may be needed to understand low rates of publication observed.
JAMA Network Open (Jennifer Kao, Joseph S. Ross and Jennifer E. Miller)
Open data is receiving increased attention and support in academic environments, with one justification being that shared data may be re-used in further research. But what evidence exists for such re-use, and what is the relationship between the producers of shared datasets and researchers who use them? Using a sample of data citations from OpenAlex, this study investigates the relationship between creators and citers of datasets at the individual, institutional, and national levels. We find that the vast majority of datasets have no recorded citations, and that most cited datasets only have a single citation. Rates of self-citation by individuals and institutions tend towards the low end of previous findings and vary widely across disciplines. At the country level, the United States is by far the most prominent exporter of re-used datasets, while importation is more evenly distributed. Understanding where and how the sharing of data between researchers, institutions, and countries takes place is essential to developing open research practices.
arXiv (Geoff Krause, Madelaine Hare, Mike Smit, Philippe Mongeon)
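JB: For readers who want to explore the underlying data themselves, OpenAlex exposes a free public API. Here’s a minimal sketch (my illustration, not the authors’ method) that pulls the most-cited works OpenAlex classifies as datasets:

```python
# Sketch: query OpenAlex's public REST API for the most-cited works of
# type "dataset". Illustrative only; not the paper's actual pipeline.
import requests

resp = requests.get(
    "https://api.openalex.org/works",
    params={"filter": "type:dataset", "sort": "cited_by_count:desc", "per-page": 5},
)
resp.raise_for_status()
for work in resp.json()["results"]:
    print(f"{work['display_name']}: {work['cited_by_count']} citations")
```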
And finally...
By chance, I stumbled across this quote from my old boss in my archives. Richard wrote it 5 years ago. He was ahead of the curve, as usual.
Until next time,
James
P.S. If you got this far presumably you enjoy reading Journalology. Please forward this newsletter to your colleagues or leave a testimonial on the community wall.