The growth of the internet and social media is matched by few other developments in human history. It is hence not surprising to witness their influence on research metrics. Several traditional metrics have been used at the level of the journal and the author. Alternative metrics that are web- and social media-based, commonly referred to as ‘Altmetrics’, have garnered significant attention from research communities globally. Altmetrics essentially capture the interaction between a research work and its audience by considering traditional citation metrics alongside social media platforms (Facebook, Twitter, Google+, YouTube) and mainstream media outlets. The sources tracked include news, blogs, Wikipedia, policy documents, syllabi, and reference managers such as Mendeley. Altmetrics are, therefore, attention metrics that aim to measure the attention a research work is receiving. An altmetric or attention score is calculated from the volume of mentions, the importance of each source, and the authoritativeness or expertise of the authors (a simplified, illustrative calculation of this kind is sketched at the end of this piece). The types of engagement captured for a particular research work include online viewing, downloading or saving, discussion, recommendation, and citation. Altmetrics are visually presented as a colourful ‘Donut’, with each coloured stripe representing a different platform on which the work was mentioned (Figure 1). The volume of mentions from each source is shown in the summary counts.

PlumX is an aggregator that takes altmetrics a step further and offers more metrics in one place, including traditional citation metrics. It tracks more than twenty types of research output. Better information is provided by grouping the metrics into five meaningful categories: citations (traditional citation data), usage (downloads or views), mentions (blogs, Wikipedia), captures (bookmarks, favourites), and social media (shares, likes, or tweets). The metrics are represented by a ‘Plum print’, which has a stem and five coloured circles, one for each category (Figure 2). The size of each circle reflects the magnitude of the metric it depicts, and more details can be obtained by moving the cursor over each circle (Figures 2 and 3). Another use of altmetric data is the comparison of different institutions using PlumX Benchmarks. Such data can also help a researcher complete the National Institutes of Health (NIH) biosketch and support applications for funding opportunities. PlumX metrics hence represent a broader shift in the utilization of altmetric data and are likely to provide a better perspective on overall research impact.

There are several advantages to using altmetrics. Foremost among them is the speed at which the information is collected (compared with the slower accrual of citation metrics), giving a prompt overview of research impact (how much a given work is being discussed or mentioned). Dissemination of the research work is also rapid and extends well beyond traditional journals and articles. Since the sources of altmetrics are updated in real time, one can expect up-to-date information on the attention an article is receiving. Besides tracking scholarly productivity, altmetrics can also reflect discussions around upcoming, unpublished research.

As with any other metric, altmetrics also suffer from numerous limitations. They are essentially attention metrics and do not directly reflect the quality of the research work. They can also be easily manipulated through social media platforms.
People tend to be more interested in popular topics than in focused research on less popular ones, and sensational claims or topics receive more attention than serious academic research. Not all digital media platforms are covered, and those that are included are heterogeneous; the diversity of sources feeding into the data makes comparisons difficult. The nature of the attention (praise or criticism) is not captured. The lack of a conceptual framework, the variable social media involvement of researchers, and the language bias towards English are other limitations. The correlation of altmetrics with traditional metrics remains controversial and debatable, with arguments on both sides of the divide. Altmetrics are still evolving and, going forward, will need further standardization and refinement to become a better measure. As William Edwards Deming, the American statistician, once said, ‘In God we trust; all others must bring data’. Altmetrics are here to stay. However, evaluating altmetrics alone would be an inadequate basis for research assessment. A combination of traditional citation metrics and altmetrics gives a better measure of a particular research work’s global performance and its individual research impact.
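As a concrete illustration of the scoring and categorization ideas described above, the following minimal sketch computes a source-weighted attention score and a PlumX-style five-category summary from raw counts. It is illustrative only: the weights, source names, and category mapping below are assumptions made for this example, since the actual scoring algorithms used by Altmetric and PlumX are proprietary and considerably more nuanced.

```python
# Hypothetical sketch: a source-weighted "attention score" in the spirit of the
# Altmetric donut, plus a PlumX-style grouping of counts into five categories.
# All weights, source names, and category assignments are illustrative
# assumptions, not the vendors' actual algorithms.

from collections import Counter

# Assumed per-source weights: "more authoritative" sources count for more.
SOURCE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "policy_document": 3.0,
    "wikipedia": 3.0,
    "twitter": 1.0,
    "facebook": 0.25,
}

# Assumed mapping of raw engagement types onto PlumX-like categories.
PLUMX_CATEGORIES = {
    "citations": {"crossref_citation", "pubmed_citation"},
    "usage": {"view", "download"},
    "captures": {"mendeley_save", "bookmark"},
    "mentions": {"news", "blog", "wikipedia", "policy_document"},
    "social_media": {"twitter", "facebook", "like", "share"},
}


def attention_score(mentions):
    """Weighted sum of mention counts; unknown sources default to weight 1."""
    return sum(SOURCE_WEIGHTS.get(source, 1.0) * count
               for source, count in mentions.items())


def plum_print(events):
    """Summary counts per category, analogous to the five circles of a Plum print."""
    summary = Counter()
    for event_type, count in events.items():
        for category, members in PLUMX_CATEGORIES.items():
            if event_type in members:
                summary[category] += count
    return dict(summary)


if __name__ == "__main__":
    mentions = {"news": 2, "blog": 1, "twitter": 40, "facebook": 10}
    events = {"download": 310, "mendeley_save": 55, "news": 2,
              "twitter": 40, "crossref_citation": 12}
    print("Illustrative attention score:", attention_score(mentions))
    print("Illustrative Plum print counts:", plum_print(events))
```

In this toy scheme, a single news story contributes far more to the score than a single tweet, reflecting the idea that the score depends on source importance as well as the sheer volume of mentions; the category summary, by contrast, simply reports counts per bucket without weighting, which is closer in spirit to how PlumX presents its five circles.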