Journalology #27: Unionisation



Hello fellow journalologists,

I always enjoy hearing from readers, and last week a Journalology subscriber sent me a link to a preprint entitled To Be Scientific Is To Be Communist. The authors argue that commercial entities that do research, such as pharmaceutical companies, are not “engaging in properly scientific inquiry”. The authors go on to say:

… our argument gives support and impetus to the Open Science movement. By making science cheaply and widely available, by facilitating more openness about what is being done and why, and by making it possible for a greater variety of people to participate, Open Science initiatives embody the definitive feature of science as such. Open Science is a step towards the better realization of communism within scientific communities, which is just to say it represents science achieving its highest ideal.

PLOS has been a driving force in the open research movement over the past two decades, so I was interested to hear from another reader yesterday, who sent me a link to this Twitter thread.

Unions have an important role in society by advocating for the rights of their members. Here are a few quick observations that the new PLOS Union may want to consider during the collective bargaining process.

(1) The latest set of annual accounts for PLOS (for 2021; the 2022 annual report has not been published yet) states that PLOS generated revenues of $37.7 million in 2021 and incurred costs of $32.5 million, and therefore generated a surplus of $5.2 million. According to Form 990, PLOS had 156 employees in 2021 and the total cost of employment was $19.9 million ($128,000 average cost per employee; see page 10 for a breakdown of the costs). The organisation had $16 million in cash and unrestricted investments, which means that if revenues ceased abruptly it would only be able to cover costs for ~6 months before struggling to meet payroll.
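For readers who want to check the arithmetic, here is a quick back-of-envelope sketch using only the figures quoted above (a simplification: it assumes costs would continue at the 2021 run rate).

```python
# Back-of-envelope check of the PLOS figures quoted above (2021 annual report and Form 990).
revenue = 37.7e6          # 2021 revenue, USD
costs = 32.5e6            # 2021 costs, USD
employees = 156           # headcount per Form 990
employment_cost = 19.9e6  # total cost of employment, USD
reserves = 16.0e6         # cash and unrestricted investments, USD

surplus = revenue - costs                            # ~$5.2 million
avg_cost_per_employee = employment_cost / employees  # ~$128,000
runway_months = reserves / (costs / 12)              # ~5.9 months if revenue stopped

print(f"Surplus: ${surplus / 1e6:.1f}m")
print(f"Average cost per employee: ${avg_cost_per_employee / 1e3:.0f}k")
print(f"Runway at 2021 cost levels: {runway_months:.1f} months")
```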

(2) When Wiley announced that it was pausing special issues in Hindawi journals because of problems with paper mills, the company predicted that its revenues would drop by $30 million in the 2023 fiscal year. In other words, it is entirely possible for revenues to abruptly fall off a cliff in fully OA journals; financial reserves are incredibly important.

(3) The PLOS journals published 36,500 articles in 2013 but only 20,500 last year (the graph below is from Digital Science’s bibliometric tool Dimensions; the y-axis is the number of articles). Revenues for 2022 are unlikely to have increased much on 2021, since article volumes were flat, but costs probably increased because PLOS has launched new journals and therefore presumably hired more staff.

Meanwhile, new OA competitors, such as MDPI and Frontiers, are growing exponentially and creating an increasingly competitive marketplace, making it harder for PLOS and other medium-sized publishers to thrive (the graph below is from Dimensions; the y-axis is the number of articles).

(4) I submitted a peer-review report for a PLOS journal on Friday and the user experience was poor. The manuscript tracking system only allowed plain text: hyperlinks disappeared (Grrr…). A best-in-class user experience is increasingly important to attract submissions.

(5) The PLOS management team has to balance the personal financial requirements of its staff, many of whom live and work in one of the most expensive cities in the USA, with the need to invest in new staff and technologies that (a) reduce the risk of publishing ethically dubious research, which could cause a catastrophic loss of revenue, and (b) improve the author experience (perhaps by using these six strategies) so that more authors submit to PLOS journals, increasing revenues and strengthening the organisation’s financial resilience so that it can fulfil its mission. This is a significant management and leadership challenge, and the margin for error is small. Open communication between all stakeholders will be vitally important.


Pippa Smart is hosting an Editorial School for Journal Editors over a four-week period starting on May 31. If you have editors on your journals who need some additional support, this is a great way to improve their knowledge and skills.

Another way to help your editors learn about scholarly publishing is to encourage them to subscribe to this newsletter.


Briefly quoted

Saving time and money in biomedical publishing: the case for free-format submissions with minimal requirements

Among the analyzed journals, we found a huge diversity in submission requirements. By calculating average researcher salaries in the European Union and the USA, and the time spent on reformatting articles, we estimated that ~ 230 million USD were lost in 2021 alone due to reformatting articles. Should the current practice remain unchanged within this decade, we estimate ~ 2.5 billion USD could be lost between 2022 and 2030—solely due to reformatting articles after a first editorial desk rejection.

BMC Medicine (Amy Clotworthy et al)


Article that assessed MDPI journals as “predatory” retracted and replaced

A 2021 article that found journals from the open-access publisher MDPI had characteristics of predatory journals has been retracted and replaced with a version that softens its conclusions about the company. MDPI is still not satisfied, however.

Retraction Watch


Is the Essence of a Journal Portable?

But it also seems possible – hear me out, now – that in the case of NeuroImage, neither authors nor subscribers will pay much attention to the departure of the board, either in the short or in the long run. Maybe for most authors, the brand name of NeuroImage depends primarily on things other than the makeup of the editorial board (cough Impact Factor cough), and as long as they feel reasonably confident that the board remains competent, they will still feel very much that they’re submitting to the same NeuroImage they always have. Maybe authors and subscribers alike will simply proceed with business as usual, assuming that Elsevier is fully capable of replacing the old editorial board with another very good one and is very likely to do so. The question here isn’t whether having an editorial board matters; the question is how much the particular individuals who make up a particular editorial board matter – both to the actual quality of the journal and to its desirability as a publication outlet in the minds of authors.

The Scholarly Kitchen (Rick Anderson)


Reviewer Training as a Form of Engagement

It might be time for journals to honestly reframe the conversation as a crisis. Most researchers already view peer review as burdensome, albeit one that is necessary. They might not know how hard it has become, however, for journals to find good reviewers. Journals would be well served, therefore, to not only reveal the extent of the problem but also offer ready-made solutions that are meaningful beyond reviewer “thank you” acknowledgements. Developing training programs may sound like a huge effort, and it could be if you go all in with didactic lecture series, mentor-driven seminars, and resource manuals, but it does represent an interesting solution. Equally, your efforts may be as simple as a virtual question-and-answer session with an editor via a webinar or at the annual society meeting. Whatever you attempt, don’t just build something and expect individuals to come. Most won’t. Consider incentives and invest time in your messaging. Explain the benefits. Convey how, for journals and reviewers alike, investing effort in reviewer training benefits everyone in the long term. Use training efforts as an approach to enhance the journal brand: one that invests in its community, one that demands quality but provides pathways to success for all and one that is inclusive.

Origin Editorial (Jason Roberts)


Smorgasbord: Trends from the Spring 2023 Meetings and Conferences

Citation performance in OA journal articles has continued to decline over recent years toward the level of non-OA articles. But the decline has been much more rapid in fully-OA articles than in hybrid-OA. Much of this is attributed to the strong growth in fully-OA articles being published (growing far faster than hybrid-OA). While this may sound like a negative, to me it is indicative of the normalization of OA. Hybrid-OA, where authors have a choice not to pay, remains the domain of the wealthy that can afford it. But fully-OA seems to have become increasingly democratized and its average level of performance looks more and more like the average level of performance everywhere else. The declining bias is as clear a sign as possible of the mainstreaming of OA publishing.

The Scholarly Kitchen (David Crotty)


PLOS Authorship policy update: Adopting a more inclusive standard

On May 10, 2023, PLOS updated our Authorship policy. Among other changes, we updated our authorship criteria: all PLOS journals except PLOS Medicine now apply the authorship criteria put forth in a 2018 PNAS article by Marcia McNutt et al. (PLOS Medicine is an ICMJE member and continues to apply ICMJE authorship criteria.)

The Official PLOS Blog


Peer review is broken. Paying referees could fix it

Indeed, it is increasingly difficult to make the case that even a permanent academic salary compensates for reviewing work given the huge workloads that modern academics already face. A poll by Nature in November 2022 found that almost one third of researchers had cut back on peer review, mostly because of work pressure. Meanwhile, academics despair of lengthening review times, while there is mounting evidence, beyond the anecdotal, that it’s becoming more difficult for editors to find peer reviewers. This is leading to talk that the peer-review system is fundamentally broken. Something clearly has to give.

Times Higher Education (Duncan Money)


Retractions should not take longer than two months, says UK Parliament committee

A new report from a UK Parliament committee calls for scientific publishers to correct and retract papers much quicker than they currently do, for the sake of research integrity and reproducibility. The Science, Innovation and Technology Select Committee of the House of Commons issued its report today, following an inquiry to which Retraction Watch and one of our cofounders, Ivan Oransky, provided evidence. Many others also gave evidence, including sleuth Dorothy Bishop.

Retraction Watch (Ellie Kincaid)

Report: Reproducibility and Research Integrity - Science, Innovation and Technology Committee


Reproducibility and Research Integrity top UK research agenda

The report says that publishers, funders, research organisations and researchers all have a role to play in improving research integrity in the UK, with the focus on reproducibility and transparency. It recommends that funding organisations should take these factors into account when awarding grants and that the Research Excellence Framework (REF), a method for assessing how to share around £2 billion of funding among UK universities based on the quality of their research output, should score transparent research more highly. Publishers are advised to mandate sharing of research data and materials (e.g. code) as a condition of publication and to increase publication of registered reports (published research methodologies that can be peer reviewed before a study is run) and confirmatory studies.

TL;DR - Digital Science (Simon Porter)


AI and Scholarly Publishing - A (Slightly) Hopeful View

There’s a lot of process-driven activity in publishing, and peer review is just one place where there are many opportunities for increased efficiency. Considering the size of the content datasets available at most publishers just on their own journals and books, using machine learning to search for relevant reviewers, or assess whether a manuscript’s topic is suitable for a given journal, seems to make sense… That being said, making any editorial decisions based solely on Machine Learning has all the potential pitfalls of baking in existing bias from the data it has been trained on, and so it’s unlikely to ever be the only information that Editors should consider when making publishing decisions.

The Scholarly Kitchen (Emma Watkins)


Fake scientific papers are alarmingly common

When neuropsychologist Bernhard Sabel put his new fake-paper detector to work, he was “shocked” by what it found. After screening some 5000 papers, he estimates up to 34% of neuroscience papers published in 2020 were likely made up or plagiarized; in medicine, the figure was 24%. Both numbers, which he and colleagues report in a medRxiv preprint posted on 8 May, are well above levels they calculated for 2010—and far larger than the 2% baseline estimated in a 2022 publishers’ group report.

Science (Jeffrey Brainard)

JB: I took this news story at face value when I read it, but then I looked at the preprint, which has significant methodological problems. Reader beware!


And finally...

Thank you for reading to the end. I plan to start to collect feedback on the newsletter to use in marketing material. If you enjoy the newsletter and would be willing to say something nice about it publicly, please do drop me a line.

Until next time,

James

Journalology

The Journalology newsletter helps editors and publishing professionals keep up to date with scholarly publishing, and guides them on how to build influential scholarly journals.
