11 May 2026

Visibly vanishing


By Mohammad Tariqur Rahman

Amidst the dictum "publish or perish," a new one has emerged in academia: "be visible or vanish". The new dictum was introduced in the 2023 book "Engage, Influence and Ensure Your Research Has Impact" by Inger Mewburn and Simon Clews.

The survival of academics in their profession depends largely on the number of papers they publish. A growing tally of papers adds credit to their reputation.

For higher prestige, the number of papers alone does not suffice; the papers need to appear in journals with high impact factors.

Arguably, the race to increase the number of papers has led to various forms of scientific misconduct, including but not limited to unethical authorship practices (e.g., guest and honorary authorship), the emergence of paper mills, and the publication of unauthenticated or manipulated results.

This trend of scientific misconduct has been condemned, yet no practical measures have been taken to control or reduce it. Rather, the increasing number of papers retracted every year attests to an ongoing "pandemic" of scientific misconduct.

Will the new dictum "be visible or vanish" then add to the pandemic?

Visibility in academia is generally measured by the number of citations an academic's papers receive. Indeed, citations tend to increase with the number of publications, although some academics accumulate more citations than others with fewer papers. Researching a popular topic also raises the chance of higher citations.

Self-citation, i.e., when authors cite their own papers, can be monitored by most bibliometric databases, such as Scopus or Web of Science. The practice becomes unacceptable when authors cite their own papers that are neither relevant nor important to the new work. Using Scopus records, a PLOS One paper in December 2023 identified Colombia, Egypt, Indonesia, Iran, Italy, Malaysia, Pakistan, Romania, Russia, Saudi Arabia, Thailand, and Ukraine among the countries (i.e., academics from those countries) with the most anomalous self-citation in the world.

Citing existing literature is an academic norm that situates new research findings, i.e., portrays their rationality, validity, and importance in academic publications. The number of citations, in turn, indicates the impact (and popularity) of a published paper. Yet while the "number" of citations provides the visa for a paper's visibility among the global audience, it does not necessarily represent the paper's importance.

For example, one of the most cited papers in the history of academia (more than 305,000 citations as of 2014) describes how to quantify proteins in a solution. Even one of the most groundbreaking publications in the life sciences, the DNA sequencing method (more than 65,000 citations as of 2014) that earned a Nobel Prize and led to the complete sequencing of the human genome, came nowhere near the citation count of the protein quantification paper.

Needless to say, a large number of research publications remain behind the curtain, never cited at all. Former Harvard president Derek Bok, in his book "Higher Education in America" (published in 2015), noted that the majority of articles published in the arts and humanities (98%) and social sciences (75%) are never cited by another researcher. The current trend is not expected to be very different.

That raises an imperative question: does a low (or zero) citation count make research less (or not at all) useful?

Say a researcher is interested in (or finds it important to study) a very rare disease affecting less than 0.1% of the global population. Compared with cancer research, research on such a rare disease will attract very few citations. Similarly, a publication addressing a national issue is less likely to receive a high number of citations than one addressing a global issue.

Those two examples suffice to show that the number of citations can fail to reflect the importance of research publications. Indeed, it would be wrong to use citations as a measure of the impact of such research.

Turning back the clock, one finds that the dictum "publish or perish" was introduced to academia in 1942 in Logan Wilson's book, "The Academic Man: A Study in the Sociology of a Profession", according to Eugene Garfield, founder of the Institute for Scientific Information (ISI). The journal Impact Factor (IF) was then introduced by Garfield in 1975 as part of the Journal Citation Reports.

Eventually, academics were motivated (read: forced) not only to publish more and more papers but also to place them in higher-ranking journals as measured by IF. Having a large number of papers and publishing in "high"-ranking journals became requirements in academia for appointment, promotion, and even grant approval.

Now, in less than 100 years, academia is experiencing a new survival dictum: be visible or vanish. Amidst the legitimate criticism, academic policy makers will likely continue to impose the new dictum for appointment, promotion, and even grant approval.

I wonder if the "inventors" of new knowledge, i.e., academics at universities, know what comes next?


Prof Mohammad Tariqur Rahman is the Deputy Executive Director (Development, Research & Innovation) at the International Institute of Public Policy and Management (INPUMA), Universiti Malaya, and can be reached at tarique@um.edu.my
