Two years ago, a Pune-based PhD student who is enrolled at a university in Madhya Pradesh submitted a paper to a social science journal that promised to upload it online.
She had to pay only ₹5,000 to publish the paper, and immediately received a PDF of her paper and a certificate via email.
"But later when I searched online, I could neither find the paper nor the journal's website," says the 32-year-old, who does not want to be identified.
It's one of thousands of examples of researchers falling prey to fake, cloned or predatory journals -- a phenomenon that has become more rampant in a digital age and especially in a post-pandemic scenario where academia has shifted from the real to the virtual world.
Although no official estimates are available, the financial losses are immense.
Researchers often end up paying upwards of ₹20,000 for an article, says Sumit Narula, a whistle-blower, adding that in India the number of research papers published in fraud journals could be as high as 3,000 to 5,000 in a year.
Narula, who heads the Amity School of Communication at Amity University, Gwalior, was also cheated by a predatory journal in March 2020.
Since then, he has been tracking fraud journals and conducting workshops to help students and researchers identify them.
A predatory journal is one that charges publication fees from authors without vetting articles for quality and legitimacy.
Researchers are tricked into publishing with them, even though some authors may be aware that the journals are of poor quality or fraudulent.
"Cloned journals are counterfeit mirrors that exploit the title and ISSN (International Standard Serial Number) of authentic journals; some cloned journals actively chase authors who may be duped into paying an open-access publication fee," explains Narula.
He flags suspect cases to leading citation databases such as Scopus (owned by Dutch publisher Elsevier) and Web of Science (owned by analytics firm Clarivate), which identify and discontinue predatory journals.
In November 2018, the University Grants Commission (UGC) announced the setting up of a Consortium for Academic and Research Ethics (CARE) to promote quality research in universities.
This led to the creation of the UGC-CARE List of journals, which now has two groups:
I. Journals found qualified through UGC-CARE protocols.
II. Journals indexed in globally recognised databases.
The dynamic list is updated quarterly and includes the corresponding cloned journals that mirror the authentic ones.
The latest list shows 36 journals in Group I and 40 in Group II.
Misguided or misinformed
"The problem of fraud journals existed earlier, but it has grown since publishers went digital," says Sangeeta Menon, who heads South Asia publishing for academic publisher Emerald.
"The highest number of victims of predatory journals comes from South Asian countries, particularly India," she adds.
There are multiple reasons for this.
First, there is a lack of understanding of open/paid access research and predatory publishing among researchers.
"There is a lot of focus from the Indian government to have quality research output.
"So there is pressure that journals have to be Scopus indexed, with indicators of quality research, but researchers are sometimes unable to distinguish between genuine and false impact factors created by fake agencies.
"Because of these problems, we have seen a lot of predatory journals flourish."
Emerald signs up journals for open access and conducts workshops on it.
It calls itself a digital-first, open-access publisher and does not put the onus of payment on researchers; publication fees are sponsored by institutions.
"Around 2010-11 publishers found themselves in a position where they had to go open access (which offers unrestricted access online to research)," says Menon.
"When we are working with a fairly misguided or misinformed market about open access (that it's all free or predatory), we as publishers are struggling because researchers are asking for open access from us," adds Menon.
She points out that India is among the top five nations when it comes to the output of scholarly publications, but its citation share is barely above 3 per cent.
"A big part of the problem is where our researchers are publishing. A large portion is lost because of predatory journals, which do nothing for researchers," she says.
"On the other hand, when researchers can't afford open access, even if they publish in a good journal, it's behind a paywall so there is no visibility. As a digital-first publisher, our effort is to change this."
Researchers in top-rung universities and institutes find better guidance and are less prone to fake journals, although it is a challenge across tiers.
Suman Chakraborty, a faculty member at the Indian Institute of Technology-Kharagpur, agrees that the IITs are not much affected, owing to their autonomy.
IIT-Kharagpur, for instance, has a list of top journals and conferences set up by independent committees that ensure quality even though it may not be foolproof, he says.
While the UGC's objective was good as it sought to maintain a minimum standard of what is published, various universities under it do not necessarily focus on quality indicators, adds Chakraborty.
"I think the onus should be on heads of institutes and departments.
"Whenever we get applications, we may keep some metrics such as number of publications as a necessary condition.
"But there are empowered committees that look into details in terms of quality, novelty and impact. That's not there in many institutes."
The best universities abroad, he adds, rely on peer reviews.
"They send papers to a sufficient number of peers. If somebody is good, on an average there is a chance that it will be a fair assessment. That is a better approach than targeting only UGC-recognised publications."
Quacquarelli Symonds (QS), the British company that analyses higher education and issues global university rankings annually, cautions academics and students against engaging with any approach that appears to offer accelerated advantages without diligent verification.
Asked whether it has noticed universities focusing more on quantity of research than quality in a bid to improve their rankings, Ben Sowter, research director and senior vice president, QS, says that a majority of the metrics used in rankings are "proxies for quality or influence at scale, rather than purely of volume".
"However, in trying to build a more research-intensive culture, institutions themselves often create their own incentives -- sometimes financial, sometimes performance measures -- that are designed to develop a culture of productivity first, before intending to move to a more quality focus down the track," he says.
"Though well-intentioned, these incentives can apply a degree of pressure and may have the unintended consequence of exposing vulnerabilities."
As a rankings agency, QS relies mainly on the content policy and quality assurance processes of its bibliometric partner, Scopus/Elsevier, while evaluating research published by universities.
Feature Presentation: Rajesh Alva/Rediff.com