ABSTRACT
Governments are increasingly pushing researchers to engage in activities with societal impact, emphasizing the need for research dissemination and engagement with the broader public. This study addresses this imperative by investigating the multifaceted factors that influence social media attention, particularly on Twitter, for scientific research. Using Altmetric data and employing multiple linear regression analysis, this paper explores the determinants of Twitter mentions for research outputs. The study shows that certain factors have a significant impact on the level of engagement. In particular, the presence of research in mainstream news emerges as the most influential factor, highlighting the power of media coverage in increasing research visibility. In addition, research topics that align with highly topical issues, such as the COVID-19 pandemic, also garner significant attention on Twitter. Conversely, the influence of expert recommendations and the consolidation of knowledge in the form of review articles have a relatively weaker impact on Twitter mentions. In addition, this study underscores that public policy references in reports and citations within Wikipedia have limited influence in driving social media attention. Interestingly, mentions in patent applications do not have a significant impact in this context. In conclusion, this study provides valuable insights into the dynamics of research dissemination in the digital age and sheds light on the nuanced factors that can enhance or diminish its societal impact on Twitter.
INTRODUCTION
Governments increasingly push researchers toward activities with societal impact, including economic, cultural, and health benefits.[1] This is why, since the term 'altmetrics' was introduced in 2010,[2] both theoretical and practical research has been conducted in this discipline.[3] Altmetrics make it possible to know how the results of scientific research are perceived and commented on beyond academia.[4]
Twitter is nowadays the social media platform most used by the general population (and by researchers in particular) to disseminate and comment on the results of scientific research. Many and varied factors may influence the social media attention that research receives. According to the literature review below, these factors include mainstream news coverage, the topic addressed (COVID-19, for example), and certain characteristics of the research, such as its proximity to social issues (impact on public policy) and to business (impact on patents), its contribution to the consolidation of knowledge (in the form of a review or a mention on Wikipedia), and the recommendation of experts (Faculty Opinions), among other aspects.
In this paper, the effect of these factors on the social attention that research receives through mentions on Twitter is quantified. For this, mention data from Altmetric and a multiple linear regression analysis are used. The unit of study is the research paper (article or review) published in disciplinary journals in the field of Clinical Medicine. The time frame covers 2018-2020, and the country analyzed is Spain.
The choice of a study group focused on the medical discipline is a direct response to the health crisis caused by COVID-19. Given the gravity of the situation, I chose to study an issue that has significant social impact and has received widespread attention on various social media platforms. Specifically, I examined this issue in a specific geographical area, namely Spain. The decision to focus on Spain was not arbitrary, but rather rooted in the author's intimate understanding of the unfolding reality of the pandemic during the period under study.
LITERATURE REVIEW
Most altmetric data accumulate faster than citations after publication.[5] However, except for Mendeley readership, which is moderately correlated with citations,[6] there is a negligible or weak correlation between citations and most altmetric indicators.[7] This means that altmetrics might capture diverse forms of impact that differ from citation impact.[8]
Altmetrics address the need for researchers to provide evidence of the societal impact of their results. However, the societal impact of research is difficult to measure, because a long time can elapse between basic research and its practical applications,[9] and because the obsolescence of results strongly conditions the impact of research.[10] Thus, in addition to mentions in social media and mainstream news, altmetrics also include references in public policy documents and recommendations that are more scholarly than societal.[11] Furthermore, the variety of indicators and their differences advise using them separately rather than as composite indicators.[8]
The attention received by research and its impact are not synonymous. Social attention is a more complex phenomenon because it can be motivated by positive or negative aspects of the research.[3] Among the different dimensions of social attention, mentions on Twitter and Facebook can represent discussion on social media, blogs and news might reflect attention to newsworthiness, and Wikipedia might capture informational attention.[12] Moreover, there are different levels of attention depending on the commitment that the social interaction entails: retweeting is not the same as writing a blog post.[11]
Authors have found a higher presence of altmetrics in the social sciences and humanities than in the natural sciences,[13] suggesting that altmetrics can complement citations, especially in those fields. Authors have also been interested in the platforms that collect altmetrics, such as Altmetric, Impactstory, and PlumX, with regard to their data sources, the indicators provided, the speed of data accumulation, and so on.[5,14,15] Differences in mention coverage between countries and disciplines have also been analyzed.[16]
As indicated, many and varied factors influence the social attention of research. In addition to those already mentioned, Faculty Opinions (formerly F1000Prime) is a system for post-publication peer review in which experts identify, assess, and comment on relevant papers they read.[17]
METHODOLOGY
Disciplinary journals correspond to those in the ESI field of Clinical Medicine in the Web of Science database. To identify the scientific production of Spain, all research articles and reviews in Web of Science in which at least one co-author had an affiliation located in Spain were considered. For this, the search field 'Address' was used (AD = Spain), and the query was limited to the Science Citation Index Expanded (SCI-Expanded). The search was not limited by language of publication, so some papers may be in languages other than English. Web of Science was also used to identify the papers that include the term COVID-19 and/or SARS-CoV-2 in the title, as well as the document type.
The source of the altmetric data is Altmetric, one of the first and currently most popular altmetrics aggregator platforms, which originated in 2011 with the support of Digital Science. It captures the online presence of research and analyzes the conversations around it, tracking and accumulating mentions of scientific articles from various social media platforms, news, blogs, and other sources.[18] The altmetric data were identified by the document's DOI.
The Web of Science database provided the DOIs of the Spanish research production in Clinical Medicine in 2018-2020, a total of 47,211 research papers of which 29,894 were in the Altmetric platform with some social mention (60%). A total of 610 papers of those indexed in Altmetric contained the terms COVID-19 and/or SARS-CoV-2 in the title. Data were retrieved in December 2021.
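For illustration, per-DOI counts of this kind can be retrieved programmatically. The following is a minimal sketch against Altmetric's public details-page API; the endpoint and the JSON field names are assumptions based on the public documentation and do not necessarily reproduce the exact pipeline used in this study.

```python
from typing import Optional

import requests

ALTMETRIC_API = "https://api.altmetric.com/v1/doi/{doi}"  # public details-page endpoint (assumed)

def fetch_altmetric_counts(doi: str) -> Optional[dict]:
    """Return selected mention counts for a DOI, or None if Altmetric has no record of it."""
    resp = requests.get(ALTMETRIC_API.format(doi=doi), timeout=30)
    if resp.status_code == 404:  # DOI not tracked by Altmetric
        return None
    resp.raise_for_status()
    data = resp.json()
    # Field names below are assumptions based on Altmetric's public API documentation.
    return {
        "doi": doi,
        "twitter": data.get("cited_by_tweeters_count", 0),
        "news": data.get("cited_by_msm_count", 0),
        "patent": data.get("cited_by_patents_count", 0),
        "policy": data.get("cited_by_policies_count", 0),
        "wikipedia": data.get("cited_by_wikipedia_count", 0),
    }
```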
The sample employed in this study is described in Table 1. I decided to add the COVID-19 papers to a simple random sample of the other papers. In this way, the years 2018 and 2019 correspond to a simple random sample, while the year 2020 is the union of a random sample and the COVID-19 papers. The COVID-19 papers are therefore overrepresented in the sample: 12.5% versus 2% in the population. This decision is motivated by the fact that, except for news and mentions on Twitter, the remaining variables analyzed are very infrequent (see Table 3); taking only 2% of COVID-19 publications would therefore leave some variables almost entirely equal to zero within the COVID-19 group, so they would not explain anything in the proposed model. The resulting sample size was N = 4895.
Table 1. Population and sample of papers by group and year.

| Papers | Group | 2018 | 2019 | 2020 | 2018-2020 (% of total) |
|---|---|---|---|---|---|
| Population (29,894) | COVID-19 | 0 | 0 | 610 | 610 (2.0%) |
| | Others | 9064 | 9894 | 10,326 | 29,284 (98.0%) |
| | Total | 9064 | 9894 | 10,936 | 29,894 |
| Sample (N = 4895; 16.4%) | COVID-19 | 0 | 0 | 610 | 610 (12.5%) |
| | Others | 1305 | 1506 | 1474 | 4285 (87.5%) |
| | Total (% of population) | 1305 (14.4%) | 1506 (15.2%) | 2084 (19.1%) | 4895 (16.4%) |
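A minimal sketch of this sampling design, assuming the population of papers sits in a pandas DataFrame with a 0/1 column `covid19`; the column name and the sampling fraction are illustrative, not the exact script used here:

```python
import pandas as pd

def build_sample(papers: pd.DataFrame, frac_others: float = 0.146, seed: int = 42) -> pd.DataFrame:
    """All COVID-19 papers plus a simple random sample of the remaining papers."""
    covid = papers[papers["covid19"] == 1]  # every COVID-19 paper is kept (610, all from 2020)
    others = papers[papers["covid19"] == 0].sample(frac=frac_others, random_state=seed)
    return pd.concat([covid, others]).reset_index(drop=True)
```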
The methodology of this paper consists of a multiple linear regression analysis. The dependent variable is the social media attention of a research paper, measured through the number of tweets and retweets on Twitter, or Twitter mentions for short. The independent variables are described in Table 2.
Table 2. Description of the variables used in the model.

| Name | Variable | Description | Source | Type |
|---|---|---|---|---|
| 1. Twitter | Social media attention | Number of mentions on Twitter (tweets and retweets including the DOI of the paper). | Altmetric.com | Natural number N = {0, 1, 2, …} |
| 2. News | Mainstream news | Number of mentions in mainstream news media. | Altmetric.com | Natural number N = {0, 1, 2, …} |
| 3. Patent | Patent application mentions | Number of mentions in patent applications. | Altmetric.com | Natural number N = {0, 1, 2, …} |
| 4. Policy | Public policy mentions | Number of mentions in public policy documents. | Altmetric.com | Natural number N = {0, 1, 2, …} |
| 5. Wikipedia | Wikipedia mentions | Number of mentions in Wikipedia articles. | Altmetric.com | Natural number N = {0, 1, 2, …} |
| 6. COVID-19 | COVID-19 in the title | Inclusion of COVID-19 or SARS-CoV-2 in the paper title. | Web of Science | Dichotomous {yes = 1, no = 0} |
| 7. F1000 | Expert mentions | Number of recommendations on Faculty Opinions (formerly F1000Prime). | Altmetric.com | Natural number N = {0, 1, 2, …} |
| 8. Review | Type of research | Document type of the paper. | Web of Science | Dichotomous {review = 1, article = 0} |
Some considerations can be made regarding the choice of independent variables. Other altmetric indicators and bibliometric variables could have been considered; the criterion for selecting the independent variables of the model was to cover different dimensions. Mentions on Facebook are much less frequent than on Twitter and, unlike other studies, I chose to use a single indicator rather than an aggregation of measures with arbitrary weights. Something similar happens with blogs and news: both correspond to the same dimension, and the correlation between the two variables is high, so only news was included instead of an aggregate.
Regarding the bibliometric indicators, a topical case study (COVID-19) well represented in the field analyzed (Clinical Medicine) was used. Although open access might be of interest, during the pandemic publishers gave open access to all publications on COVID-19, so it was not included in the model. As for the review document type, a systematic review of the literature puts research into context for news reporters, policymakers, the public, and other researchers. In this sense, it seems relevant a priori.
Finally, scientific collaboration is an interesting aspect because it increases both citations and mentions. However, it would require a more detailed study in relation to the type of collaboration (number of co-authors, number of affiliations, number of countries, etc.).
RESULTS
The objective of the study is to determine whether and how the social media attention of a research paper, taking the number of Twitter mentions as a proxy, can be explained by other social mentions and bibliometric characteristics of the paper.
The dependent variable is therefore the number of Twitter mentions. The independent variables are the following: mainstream news; references in patent applications, public policy documents, and Wikipedia articles; expert mentions (taking the number of recommendations on Faculty Opinions, formerly F1000Prime, as a proxy); the topic, through the inclusion of COVID-19 or SARS-CoV-2 in the title; and the document type (article or review). The description of the variables is shown in Table 2.
I checked in the histograms that the distributions are reasonable for all variables (see Table 3 for the mean and standard deviation, among other descriptive measures). Note that there are N = 4895 independent observations in the dataset. I also checked for curvilinear relationships or anything unusual in the plots of the dependent variable versus each independent variable.
Table 3. Descriptive statistics of the variables (N = 4895).

| Variable | Mean | SD | Maximum | Sum | Count |
|---|---|---|---|---|---|
| 1. Twitter | 47.117 | 463.129 | 15,695 | 230,637 | 4895 |
| 2. News | 2.298 | 27.489 | 1429 | 11,247 | 4895 |
| 3. Patent | 0.016 | 0.186 | 6 | 80 | 4895 |
| 4. Policy | 0.040 | 0.414 | 20 | 194 | 4895 |
| 5. Wikipedia | 0.025 | 0.226 | 6 | 120 | 4895 |
| 6. COVID-19 | 0.125 | 0.330 | 1 | 610 | 4895 |
| 7. F1000 | 0.021 | 0.174 | 4 | 103 | 4895 |
| 8. Review | 0.146 | 0.353 | 1 | 716 | 4895 |
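The distribution and curvilinearity checks described above could be reproduced with a few lines of pandas/matplotlib; a sketch assuming the sample is in a DataFrame `df` with the column names of Table 2 (an assumption, since the original analysis scripts are not shown):

```python
import matplotlib.pyplot as plt

predictors = ["News", "Patent", "Policy", "Wikipedia", "COVID-19", "F1000", "Review"]

# Histograms of the dependent variable and every predictor (distribution check)
df[["Twitter"] + predictors].hist(bins=50, figsize=(12, 8))
plt.tight_layout()

# Dependent variable versus each predictor (check for curvilinear relationships)
fig, axes = plt.subplots(2, 4, figsize=(14, 6))
for ax, col in zip(axes.ravel(), predictors):
    ax.scatter(df[col], df["Twitter"], s=5, alpha=0.3)
    ax.set_xlabel(col)
    ax.set_ylabel("Twitter")
plt.tight_layout()
plt.show()
```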
The Spearman correlations are shown in Table 4. Note that the independent variables have statistically significant, though in many cases small, correlations with Twitter mentions. Twitter mentions correlate mainly with news (0.30) and COVID-19 (0.19). To a lesser extent, they also correlate with F1000 (0.14), policy (0.11), and Wikipedia (0.10).
Table 4. Spearman correlations among the variables.

| Variable | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
|---|---|---|---|---|---|---|---|---|
| 1. Twitter | | 0.30** | 0.03* | 0.11** | 0.10** | 0.19** | 0.14** | 0.03* |
| 2. News | | | 0.11** | 0.15** | 0.15** | 0.14** | 0.19** | -0.03 |
| 3. Patent | | | | 0.05** | 0.06** | -0.04** | 0.12** | 0.00 |
| 4. Policy | | | | | 0.13** | 0.13** | 0.13** | 0.00 |
| 5. Wikipedia | | | | | | 0.04** | 0.12** | 0.05** |
| 6. COVID-19 | | | | | | | 0.01 | 0.02 |
| 7. F1000 | | | | | | | | -0.03* |
| 8. Review | | | | | | | | |

* Significant at the 0.05 level; ** significant at the 0.01 level.
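A correlation matrix such as Table 4 can be computed directly from the same DataFrame; a sketch with pandas and scipy (column names assumed as above):

```python
from scipy.stats import spearmanr

cols = ["Twitter", "News", "Patent", "Policy", "Wikipedia", "COVID-19", "F1000", "Review"]

# Rank-based Spearman correlations are robust to the highly skewed count distributions
rho = df[cols].corr(method="spearman").round(2)
print(rho)

# Significance test for a single pair, e.g. Twitter vs. News
r, p = spearmanr(df["Twitter"], df["News"])
print(f"Twitter-News: rho = {r:.2f}, p = {p:.3g}")
```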
Therefore, a multiple linear regression model can estimate Twitter mentions from all independent variables simultaneously. For this, I checked the correlations among the independent variables (Table 4). Note that all absolute correlations are low (none exceeds 0.19), so multicollinearity problems can be ruled out for the regression analysis.
Note that the only significant negative correlations are observed between COVID-19 and patent (-0.04) and between review and F1000 (-0.03). Although very low, they are statistically different from zero. This means that papers on the COVID-19 topic are less referenced in patent applications and that review articles receive fewer expert recommendations. In the first case, it may be because all the COVID-19 papers correspond to the last year of the period analyzed and have had less time to be incorporated into patent applications.
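As a complementary check, variance inflation factors could be computed for the predictors entering the model; a minimal sketch with statsmodels, again assuming the DataFrame `df` of the previous sketches:

```python
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

X_check = sm.add_constant(df[["News", "Patent", "Policy", "Wikipedia", "COVID-19", "F1000", "Review"]])
vif = {col: variance_inflation_factor(X_check.values, i) for i, col in enumerate(X_check.columns)}
print(vif)  # values close to 1 indicate no multicollinearity concern
```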
According to the b-coefficients in Table 5, the regression model is:

Log10_Twitter_i = 0.721 + 0.824 · Log10_News_i − 0.203 · Log10_Patent_i + 0.534 · Log10_Policy_i + 0.747 · Log10_Wikipedia_i + 0.332 · COVID-19_i + 0.235 · F1000_i + 0.099 · Review_i   (1)

where Log10_Twitter_i denotes the predicted Twitter mentions, on a base-10 logarithmic scale, for paper i, i = 1, 2, …, 4895.
Table 5. Multiple linear regression coefficients (dependent variable: Twitter mentions on a base-10 logarithmic scale).

| Variable | B (Coeff.) | 95% CI | β (Standardized Coeff.) | t | p (Sig.) |
|---|---|---|---|---|---|
| Constant | 0.721 | [0.701, 0.741] | 0.000 | 71.793 | 0.000 |
| Log10_News | 0.824 | [0.761, 0.887] | 0.366 | 25.636 | 0.000 |
| Log10_Patent | -0.203 | [-0.917, 0.510] | -0.007 | -0.559 | 0.576 |
| Log10_Policy | 0.534 | [0.111, 0.958] | 0.034 | 2.473 | 0.013 |
| Log10_Wikipedia | 0.747 | [0.141, 1.353] | 0.033 | 2.418 | 0.016 |
| COVID-19 | 0.332 | [0.280, 0.384] | 0.162 | 12.604 | 0.000 |
| F1000 | 0.235 | [0.131, 0.339] | 0.061 | 4.435 | 0.000 |
| Review | 0.099 | [0.051, 0.146] | 0.051 | 4.051 | 0.000 |

Adjusted R² = 0.213.
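A minimal sketch of how a model of this form could be fitted with statsmodels, assuming the sample is in the DataFrame `df` used above and that the count variables enter the model as log10(x + 1); the +1 offset is an assumption to handle the many zero counts, consistent with the Log10_ labels of Table 5 but not explicitly stated in the paper:

```python
import numpy as np
import statsmodels.api as sm

df_m = df.copy()
for col in ["Twitter", "News", "Patent", "Policy", "Wikipedia"]:
    df_m["Log10_" + col] = np.log10(df_m[col] + 1)  # assumed transformation for the count variables

X = sm.add_constant(df_m[["Log10_News", "Log10_Patent", "Log10_Policy",
                          "Log10_Wikipedia", "COVID-19", "F1000", "Review"]])
y = df_m["Log10_Twitter"]

model = sm.OLS(y, X).fit()
print(model.summary())  # b-coefficients, 95% CIs, t, p, adjusted R-squared, and the ANOVA F-test
```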
The adjusted R-square is reported in Table 5. R-square is the proportion of variance in the dependent variable accounted for by the model. In this model, the adjusted R² is 0.213, which is considered acceptable in the social sciences for a model that does not aim to predict but to explain a social phenomenon. Moreover, since the p-value of the ANOVA F-test is less than 10⁻⁴, the multiple correlation of the model as a whole is significantly different from zero.
Note that each b-coefficient in equation (1) indicates the average increase in Twitter mentions (on a base-10 logarithmic scale) associated with an increase of ten units in the log-transformed predictors, or with a one-unit increase in the other predictors, everything else being equal.
Thus, ten additional mainstream news mentions are associated with a 6.67 (ten to the power of 0.824) average increase in the number of Twitter mentions, everything else being equal. Similarly, ten additional policy mentions increase Twitter mentions on average by 3.42 (i.e., 10^0.534). Moreover, ten additional Wikipedia mentions are associated with a 5.59 (i.e., 10^0.747) average increase in the number of Twitter mentions, ceteris paribus. Ten additional patent mentions are associated with an average 1.60 (i.e., 10^0.203) decrease in Twitter mentions; however, this coefficient is not statistically significant. Analogously, each expert mention in F1000 is associated with a 1.72 (i.e., 10^0.235) average increase in the number of Twitter mentions (72% higher for each recommendation), everything else being equal.
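For reference, these back-transformed values follow from exponentiating, in base ten, the corresponding b-coefficients of Table 5:

$$
10^{0.824} \approx 6.67, \qquad 10^{0.534} \approx 3.42, \qquad 10^{0.747} \approx 5.59, \qquad 10^{0.203} \approx 1.60, \qquad 10^{0.235} \approx 1.72 .
$$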
Regarding the dichotomous variables, a one-unit increase in COVID-19 is associated with a 2.15-fold (i.e., 10^0.332) average increase in Twitter mentions. Note that COVID-19 is coded 0 (no) and 1 (yes) in the dataset, so the only possible one-unit increase is from non-COVID-19 to COVID-19. Therefore, the average number of Twitter mentions for COVID-19 papers is 2.15 times higher than for non-COVID-19 papers (more than double), everything else being equal. Similarly, the average number of Twitter mentions for a review paper is 1.25 times (i.e., 10^0.099) higher than for a research article (25% higher), ceteris paribus.
The statistical significance column (Sig. in Table 5) contains the two-tailed p-value for each b-coefficient. Note that most b-coefficients in the model are highly statistically significant, with p-values below 10⁻⁴. However, mentions in patent applications do not have a significant influence.
Note that the b-coefficients do not indicate the relative strengths of the predictors, because the independent variables have different scales. The standardized regression coefficients, or beta coefficients, denoted as β in Table 5, are obtained by standardizing all regression variables before computing the coefficients, and therefore they are comparable within and between regression models.
Thus, the two strongest predictors are news (β = 0.366) and the COVID-19 topic (β = 0.162). This means that the number of news mentions is the factor, among those analyzed, that contributes the most to the social media attention of a research paper (Twitter mentions), approximately 2.3 times more than a highly mediatized topic such as COVID-19. Moreover, the number of news mentions contributes to Twitter mentions 6 times more than expert mentions in F1000 (β = 0.061), 7.2 times more than the review type (β = 0.051), 10.8 times more than mentions in policy documents (β = 0.034), and 11.1 times more than mentions in Wikipedia (β = 0.033).
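Standardized (beta) coefficients like those in Table 5 can be obtained by refitting the model on z-scored variables or, equivalently, by rescaling the unstandardized coefficients; a sketch continuing from the `model` object of the previous example:

```python
# beta_j = b_j * sd(x_j) / sd(y): rescale each unstandardized coefficient into a beta weight
predictors = X.columns.drop("const")
betas = model.params[predictors] * X[predictors].std() / y.std()
print(betas.sort_values(ascending=False))
```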
About the multiple regression assumptions, each observation corresponds to a different paper. Thus, I consider them as independent observations. The regression residuals are approximately normally distributed in the histogram. I also checked the homoscedasticity and the linearity assumptions in a plot of residuals versus predicted values. This scatterplot does not show any systematic pattern and therefore both assumptions hold.
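These residual diagnostics could be reproduced from the fitted model of the earlier sketch, for example with matplotlib:

```python
import matplotlib.pyplot as plt

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Normality check: histogram of the regression residuals
ax1.hist(model.resid, bins=50)
ax1.set_title("Regression residuals")

# Homoscedasticity and linearity check: residuals versus predicted values
ax2.scatter(model.fittedvalues, model.resid, s=5, alpha=0.3)
ax2.axhline(0, color="grey", linewidth=1)
ax2.set_xlabel("Predicted Log10_Twitter")
ax2.set_ylabel("Residual")

plt.tight_layout()
plt.show()
```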
Therefore, as a main conclusion, among the dimensions analyzed in this paper, the factor that contributes most to social media attention (Twitter mentions) is the number of mainstream news mentions, followed by the COVID-19 topic (44% relative to news for documents with similar characteristics). Other factors that also positively influence social attention on Twitter, albeit to a lesser extent, are expert mentions in F1000 (17% relative to news), the review type (14% relative to news), and mentions in policy documents and in Wikipedia (9% relative to news). All these relationships are averages and assume a comparison of similar documents (ceteris paribus). Finally, no evidence was found that mentions in patent applications are associated with social attention on Twitter when comparing similar documents.
DISCUSSION
This work tries to explain the social media attention of a research paper through other social mentions and bibliometric characteristics of the paper. For this, publications in disciplinary journals of Clinical Medicine were used. Twitter is a social media platform with the potential to help scientists disseminate health-related research for policy impact.[19] In this field, it is common for some research to be mentioned in public policy reports. In this respect, I obtained significant evidence that research with potential application in public policies is a minor factor contributing to the social media attention of a paper (Twitter mentions). This type of research is characterized by immediate application to social problems, rapid incorporation into the body of knowledge, and rapid aging.[20]
Among the characteristics of the research, I have included another document type distinction, between research article and review. A review paper is a highly valuable type of research output because it puts research into context for news reporters, policymakers, the public, and other researchers.[21] However, in the analyzed dataset, I have found evidence of only a weak association with social attention on Twitter when compared with similar documents.
There is significant evidence that the factor that contributes the most to the social media attention of a research paper (Twitter mentions) is the number of mainstream news mentions. Thus, an additional mainstream news mention is associated with a 0.67 increase in the number of Twitter mentions, everything else being equal. Moreover, an additional mention in Wikipedia is associated with a 0.56 increase in Twitter mentions, while a reference in public policy documents is associated with a 0.34 increase in the number of Twitter mentions, ceteris paribus.
Interestingly, the average number of Twitter mentions for COVID-19 papers is 2.15 times higher than for non-COVID-19 papers (more than double), everything else being equal. However, in relative terms, the COVID-19 topic contributes 44% as much as the number of news mentions to social attention on Twitter when comparing documents with similar characteristics.
Furthermore, the average number of Twitter mentions for a review paper is 1.25 times higher than for a research article (25% higher), while that for a paper recommended by an expert is 1.72 times higher than for those not recommended (72% higher for each recommendation), everything else being equal.
Regarding mentions in patent applications, I have not found evidence of their association with social attention on Twitter when compared with similar documents. A possible explanation is that this variable is more related to academic citations than to social media attention. That is, innovation and business transfer, although very relevant, are difficult to communicate through social media. In this way, the most cited research is not necessarily the one that receives the most social attention.[4] Thus, citation and social attention do not correlate with each other and, therefore, measure different dimensions of the impact of research results. This has been pointed out in the literature,[7,22] and means that altmetrics might capture diverse forms of impact that differ from citation impact.[8]
In this paper, I have focused exclusively on articles and reviews. It is worth noting that many COVID-19 papers published during the pandemic took the form of short communications, including letters, notes, and editorials. As a result, our study may have missed a significant number of related papers. Nevertheless, articles and reviews remain the primary means of communicating scientific research. Expanding our scope to include other typologies could introduce complexity into the analysis of the relationships among the variables considered. Therefore, we deliberately limited our study to the typologies we examined.
As a final consideration, altmetric data have the advantage of measuring different types of impact beyond academic citations.[18] They also have the potential to capture earlier evidence of impact, which is useful in self-evaluations. Nevertheless, the social attention of research must be used cautiously because it could provide a partial and biased view of all types of societal impact. For this reason, it should be avoided when evaluating researchers, especially in recruitment processes and internal promotions. In this work, social mentions were used to study the phenomenon of social attention of research itself and not to evaluate researchers.
CONCLUSION
In response to governments' increasing emphasis on researchers' engagement in activities with societal impact, this study explored the intricacies of the factors influencing social media attention, particularly on Twitter, for scientific research.
Using Altmetric data and multiple linear regression analysis, this research uncovered the key determinants of Twitter mentions for research outputs. Significantly, research presence in mainstream news emerged as the most powerful influence, underscoring the central role of media coverage in increasing research visibility. In addition, research topics related to current, highly topical issues, such as the COVID-19 pandemic, attracted significant attention on Twitter.
Conversely, the impact of expert recommendations and the consolidation of knowledge through review articles had a comparatively weaker influence on Twitter mentions. In addition, this study highlights the limited impact of policy references in reports and citations in Wikipedia in driving social media attention. Interestingly, mentions in patent applications did not have a significant impact in this context.
In conclusion, this research provides valuable insights into the ever-evolving landscape of research dissemination in the digital age. It highlights the nuanced factors that either enhance or diminish the societal impact of scientific research on Twitter, thus providing guidance for researchers and institutions seeking to navigate the complex terrain of public engagement and knowledge dissemination.
The choice of a study group focused on the medical discipline is a direct response to the health crisis caused by COVID-19. Given the gravity of the situation, I chose to investigate a topic that has significant social impact and has received widespread attention on various social media platforms, and to examine it in a specific geographical area, namely Spain. The decision to focus on Spain was not arbitrary, but rather rooted in the author's intimate understanding of the unfolding reality of the pandemic during the period under study. Although this study is limited to the field of medicine in the Spanish context, there is potential for a broader extrapolation of the findings. The underlying mechanisms that drive social media mentions, such as content virality, user engagement, and societal impact, may indeed have universal aspects. However, it is important to recognize that specific factors may vary significantly across domains and regions.
In order to gain a comprehensive understanding of territorial relationships in the factors influencing social media mentions, it would be prudent to conduct similar studies in different contexts. Comparative analyses across territories and disciplines could provide valuable insights into the generalizability of findings and help identify commonalities as well as unique regional or sectoral patterns. Such research efforts would contribute to a more nuanced understanding of the complex interplay between social media dynamics and various external factors.
The results of this research are highly relevant to higher education institutions, providing them with valuable insights to improve their knowledge dissemination strategies. In today's rapidly evolving academic landscape, the importance of altmetrics and the effective dissemination of knowledge cannot be overstated, as they serve as essential metrics for measuring the societal impact of educational institutions, particularly universities.
In this context, universities are under increasing pressure not only to produce groundbreaking research, but also to ensure that their knowledge reaches a wider audience and contributes to society. The findings of this study can provide universities with a roadmap for optimizing their knowledge dissemination policies, enabling them to realize the full potential of their research output.
Moreover, the implications of this research extend beyond academia. Government agencies charged with evaluating the performance of higher education institutions play a critical role in shaping the educational landscape. By taking into account the findings of this study, these government agencies can make informed decisions when designing performance agreements for universities. This, in turn, can lead to more effective policies that promote knowledge dissemination, innovation, and societal engagement within the higher education sector.
In summary, the research findings not only serve as a guide for higher education institutions to refine their knowledge diffusion policies, but also provide a valuable resource for government agencies seeking to promote excellence and social impact in higher education.
References

1. Thelwall M. Measuring societal impacts of research with altmetrics? Common problems and mistakes. J Econ Surv. 2021;35:1302-14.
2. Priem J, Taraborelli D, Groth P, Neylon C. Altmetrics: A manifesto. 2010 [accessed on 20 June 2023]. Available from: http://altmetrics.org/manifesto/
3. Sugimoto CR, Work S, Larivière V, Haustein S. Scholarly use of social media and altmetrics: A review of the literature. J Assoc Inf Sci Technol. 2017;68:2037-62.
4. Dorta-González P, Dorta-González MI. Collaboration effect by co-authorship on academic citation and social attention of research. Mathematics. 2022;10:2082.
5. Fang Z, Costas R. Studying the accumulation velocity of altmetric data tracked by Altmetric.com. Scientometrics. 2020;123:1077-101.
6. Zahedi Z, Haustein S. On the relationships between bibliographic characteristics of scientific documents and citation and Mendeley readership counts: A large-scale analysis of Web of Science publications. J Informetr. 2018;12:191-202.
7. Bornmann L. Alternative metrics in scientometrics: A meta-analysis of research into three altmetrics. Scientometrics. 2015;103:1123-44.
8. Springer Handbook of Science and Technology Indicators. 2019:687-713.
9. Godin B. The linear model of innovation: Maurice Holland and the research cycle. Soc Sci Inf. 2011;50:569-81.
10. Dorta-González P, Gómez-Déniz E. Modeling the obsolescence of research literature in disciplinary journals through the age of their cited references. Scientometrics. 2022;127:2901-31.
11. Theories of Informetrics and Scholarly Communication. 2016:372-406.
12. Thelwall M, Nevill T. Could scientists use Altmetric.com scores to predict longer term citation counts? J Informetr. 2018;12:237-48.
13. Chen K, Tang M, Wang C, Hsiang J. Exploring alternative metrics of scholarly performance in the social sciences and humanities in Taiwan. Scientometrics. 2015;102:97-112.
14. Fang Z, Costas R, Tian W, Wang X, Wouters P. An extensive analysis of the presence of altmetric data for Web of Science publications across subject fields and research topics. Scientometrics. 2020;124:2519-49.
15. Ortega JL. Reliability and accuracy of altmetric providers: A comparison among Altmetric.com, PlumX and Crossref Event Data. Scientometrics. 2018;116:2123-38.
16. Torres-Salinas D, Robinson-García N, Arroyo-Machado W. Coverage and distribution of altmetric mentions in Spain: A cross-country comparison in 22 research fields. Prof Inf. 2022;31(2).
17. Bornmann L, Haunschild R. Do altmetrics correlate with the quality of papers? A large-scale empirical study based on F1000Prime data. PLoS One. 2018;13.
18. Altmetric. Attention sources tracked by Altmetric. 2020 [accessed on 20 June 2023]. Available from: https://help.altmetric.com/support/solutions/articles/6000235983-attention-sources-tracked-by-altmetric
19. Kapp JM, Hensel B, Schnoring KT. Is Twitter a forum for disseminating research to health policy makers? Ann Epidemiol. 2015;25:883-7.
20. Rowlands I. Patterns of scholarly communication in information policy: A bibliometric study. Libri. 1999;49:59-70.
21. Grant MJ, Booth A. A typology of reviews: An analysis of 14 review types and associated methodologies. Health Info Libr J. 2009;26:91-108.
22. Costas R, Zahedi Z, Wouters P. Do "altmetrics" correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. J Assoc Inf Sci Technol. 2015;66:2003-19.