ABSTRACT
Usability is a qualitative characteristic that evaluates the ease of use of user interfaces. This study aims to conduct a systematic bibliometric analysis of usability testing and to understand the research context and trends in this field. A total of 5273 scientific publications from the Web of Science Core Collection were included in the study. Performance analysis, scientific mapping, and visualization were done using the RStudio package and the VOSviewer software tool. The results show that interest in the area of usability testing has increased significantly, especially from 1991 to 2022. The United States has the highest number of publications, citations, and co-citations. Toronto University ranked first in institutional contributions. JMIR mHealth and uHealth led in the number of publications and citations. Khajouei has the highest number of publications, whereas Jaspers ranks highest among authors by h-index. With a total link strength of 10264, Nielsen is the most strongly co-cited author. This study reveals the latest research trends and hotspots and the current state of international collaboration in usability testing research, indicating the most influential research channels. These findings include the prominent countries, institutions, journals, original articles, and authors. To the best of the authors' knowledge, this study is the first bibliometric analysis of usability testing. These findings can be useful in shaping the direction of future studies on usability testing and in understanding how usability testing research has evolved.
INTRODUCTION
The capabilities of Health Information Technology (HIT) tools are expanding quickly.[1] Numerous potential advantages for healthcare could result from HIT adoption.[2] While the integration of technology in medicine has brought significant improvements in care quality and efficiency, it also introduces potential risks and negative consequences for clinical safety and quality, including the possibility of unforeseen errors that could seriously harm patients.[3–5] Efforts must be made to achieve the benefits of health information technology and avoid its negative consequences.[5] Studies emphasize the critical role of human factors and human-centered design in ensuring well-designed health information technology systems that align with clinical workflows and patient compliance. Developing and implementing human-centered design methods within existing information technology infrastructures can deliver value to patients and physicians through the creation of user-friendly technologies.[6] Focusing on the usability of HIT, considering users, tasks, and the environment, is essential to ensure the optimal utilization of health information technology.[7] Usability encompasses various parameters that define the quality of a system and is a critical dimension of quality assessment.[8] The primary objective of usability is to make user interfaces clear and intuitive, enabling users to complete their tasks efficiently. It is regarded as an essential aspect of quality assessment.[9,10] Bevan et al., citing ISO 9241-11, define "usability in terms of effectiveness, efficiency, and satisfaction in a particular context of use."[11] Usability testing is a technique employed to identify specific usability issues with products and enhance their usability.[12] Research indicates that usability can impact user satisfaction, adoption rates of HIT, healthcare quality, effectiveness, efficiency, professional clinical decision-making, and the rate of medical errors.[10,13–16] Therefore, addressing usability issues in health information systems is crucial to prevent unintended adverse effects.[17] Consequently, usability testing is an established practice in HIT development.[6]
Despite the substantial volume of recent research on usability testing, to the best of our knowledge, no bibliometric analysis on this topic has been published. Several review studies have systematically examined aspects of usability testing, including usability challenges in the use of medical devices in the home environment,[18] system usability scale benchmarking for digital health apps,[19] usability of robotic and virtual reality devices in neuromotor rehabilitation,[20] usability of web-based applications in advocating consumers on food safety,[21] tools for evaluating the content, efficacy, and usability of mobile health apps,[22] usability of mobile health apps for postoperative care,[23] and effectiveness and usability of digital tools to support dietary self-management of gestational diabetes mellitus.[24] However, bibliometric indicators play a vital role in contemporary scientific reviews.[25] The application of bibliometrics is growing and will eventually reach all academic fields.[26] These studies have proven their utility in providing a global perspective on research hotspots, long-term trends, and the influence of contributing scholars, journals, and countries/regions. Therefore, bibliometric studies have become standard tools for assessing the quality of scientific work.[27–29] Academics can benefit from well-executed bibliometric studies in various ways, including gaining a comprehensive understanding of their field, identifying research gaps, developing novel study concepts, and establishing their position within the academic community.[30]
For the reasons above, this paper applies a quantitative bibliometric approach and network visualization to summarize research trends, hotspots, and growing and emerging issues in usability testing. To the best of our knowledge, this is the first study to provide a comprehensive review and analysis of the citation network in the usability testing literature. Its results could help determine future research directions in this field. Initially, this study presents a publication and citation trend analysis spanning from 1983 to 2022. Secondly, it provides a global analysis, focusing on countries with notable numbers of articles and citations. Thirdly, it documents the most active higher-education institutions in this field. Fourthly, it identifies significant journals contributing to this area of research. Fifthly, it acknowledges the authors who have been most prolific in usability testing research, based on their publication and citation counts. The research utilizes bibliometric link analysis and co-citation analysis to explore the interconnections among these journals, countries, and authors. Lastly, the study employs keyword co-occurrence analysis to pinpoint the most prevalent terms in usability testing research. Additionally, based on the statistical analysis conducted, we propose a prospective research agenda for further exploration in the field of usability testing.
METHODOLOGY
According to Zupic and Čater, there are five stages in the typical workflow of scientific mapping: study design, data collection, data analysis, data visualization, and interpretation.[31]
A bibliometric analysis study was designed. The Web of Science (WoS) Core Collection, a database from Clarivate Analytics, was used to search for relevant publications. WoS is a widely used resource for bibliometric studies due to the superior quality of its bibliometric data compared to that of competing databases: it has a lower rate of duplicate records and more extensive coverage of high-impact journals.[32] The WoS Core Collection is a reliable collection of scholarly journals, books, and conference proceedings that contains more than 21,100 peer-reviewed, high-quality scholarly journals published worldwide (including Open Access journals) in more than 250 subject areas in the sciences, social sciences, and arts & humanities. It is the top source on the WoS platform and the first global index of citations for scholarly and scientific research, and it is one of the world's leading citation databases, with coverage of some titles dating back to 1900.[33–35] A comprehensive search was therefore conducted in WoS on July 25, 2022. Title, abstract, author keywords, and Keywords Plus searches were conducted in the "Topic" category, covering the three predetermined research areas. Searches were not limited by publication type, date, or language.
The search strategy was ((((TS=("User-Centered Design")) OR TS=("Usability Testing")) OR TS=("Usability evaluation")) OR TS=("Usability experiment")) OR TS=("Usability study"). The acquired data were exported to Mendeley Desktop version 1.19.8 to identify and remove duplicates. Our initial search returned 11,415 records, with the earliest article dating back to 1983. Because the publication of 2022 articles was not yet complete, we selected all works published between 1983 and 2021, which reduced the set to 11,053 records. The PRISMA checklist was used to screen the results. For quality reasons, only documents classified as articles were retained, because they had most likely undergone a thorough review before publication;[36] other document types, such as book chapters, editorials, and anthologies, were excluded. This screening left 5288 documents, and 5273 records remained after removing duplicates. The relevance of the remaining articles was determined by evaluating their titles and abstracts. Two reviewers independently extracted the necessary data and independently evaluated study eligibility. In cases of doubt regarding study eligibility, a third reviewer was consulted, and a decision was made based on consensus. Because WoS does not support bibliometric analysis based on addresses or citations, performance analysis and citation network analysis were carried out using the RStudio package and the VOSviewer software for Windows, version 1.6.18 (https://www.vosviewer.com). VOSviewer is a software application for viewing and navigating network-based maps.[37] RStudio provides comprehensive tools for analyzing and visualizing quantitative data.[38]
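As a rough illustration of this workflow, the sketch below shows how a WoS Core Collection export could be loaded, filtered, and summarized with the bibliometrix R package in RStudio. The file name, the use of bibliometrix for the performance analysis, and the filtering steps are assumptions for illustration; the study's exact settings are not reported.

```r
# A minimal sketch, assuming a plain-text WoS export named "savedrecs.txt"
# (hypothetical file name) and the bibliometrix package.
library(bibliometrix)

# Convert the Web of Science export into a bibliographic data frame
M <- convert2df(file = "savedrecs.txt", dbsource = "wos", format = "plaintext")

# Keep only documents classified as articles, published up to 2021
M <- M[M$DT == "ARTICLE" & !is.na(M$PY) & M$PY <= 2021, ]

# Drop exact duplicate titles (the study used Mendeley for deduplication)
M <- M[!duplicated(M$TI), ]

# Performance analysis: productivity, citations, sources, countries, keywords
results <- biblioAnalysis(M, sep = ";")
summary(results, k = 10)  # top-10 lists and the main information table
plot(results, k = 10)
```

The co-citation and co-occurrence maps reported below would then be built from the same export in VOSviewer.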
RESULTS
Bibliometric analysis was conducted on the final sample. A general review of the findings showed that 5273 documents were published in 1666 journals between 1983 and 2021. In total, these documents cite 146832 references. The annual growth rate is 11.3%, and the average number of citations per document is 12.92. The authors used 11044 author keywords to categorize their studies, and the documents had a total of 18871 authors. Additional information about the usability testing data is presented in Table 1.
Description | Results |
---|---|
Timespan | 1983:2022 |
Sources (Journals, Books, etc.,) | 1666 |
Documents | 5273 |
Annual Growth Rate % | 11.3 |
Document Average Age | 7.58 |
Average citations per doc | 12.92 |
References | 146832 |
Document contents | |
Keywords Plus (ID) | 4519 |
Author's Keywords (DE) | 11044 |
AUTHORS | |
Authors | 18871 |
Authors of single-authored docs | 429 |
Authors collaboration | |
Single-authored docs | 475 |
Co-Authors per Doc | 4.43 |
International co-authorships % | 20.59 |
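For context on two of the indicators in Table 1, the short sketch below shows how the annual growth rate and the average citations per document are commonly derived; the growth rate is computed here as a compound annual rate, a common convention in bibliometric tools. The example call uses placeholder counts, not the study's raw data.

```r
# Annual growth rate as a compound rate between the first and last year of the timespan.
# n_first / n_last are publication counts in those years; n_years is the number of
# yearly intervals between them.
annual_growth_rate <- function(n_first, n_last, n_years) {
  ((n_last / n_first)^(1 / n_years) - 1) * 100
}

# Average citations per document: total citations divided by number of documents.
avg_citations_per_doc <- function(total_citations, n_docs) total_citations / n_docs

# Illustrative call with hypothetical counts (not the study's data):
annual_growth_rate(n_first = 2, n_last = 120, n_years = 38)
```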
As shown in Figure 1, the number of published articles on usability testing has increased steadily since 1991, with particular attention paid to the topic in recent years. The number of citations shows a similarly increasing trend, indicating high academic interest and popularity. According to the journal classification in WoS, the most predominant categories are Medical Informatics (803 records; 15.185%), Health Care Sciences Services (13.843%), Computer Science Information Systems (13.162%), Computer Science Cybernetics (11.460%), Ergonomics (10.817%), Information Science Library Science (9.020%), Computer Science Software Engineering (6.902%), Computer Science Interdisciplinary Applications (7%), Public Environmental Occupational Health (6.4%), and Computer Science Theory and Methods (6.1%) (Figure 2).
Performance Analysis
Performance analysis of documents
According to the findings, the study "IBM Computer Usability Satisfaction Questionnaires: Psychometric Evaluation and Instructions for Use" is ranked first in Table 2's list of the ten studies with the highest number of citations, outpacing all other studies by a factor of at least two. This publication describes IBM's work at the time on subjective usability measurement and examined the psychometric properties of questionnaires created for scenario-based usability testing.[39]
Authors (AU) | Title (TI) | Year (PY) | Source (SO) | Total Citations (TC) |
---|---|---|---|---|
James R. Lewis. | IBM computer usability satisfaction questionnaires: Psychometric evaluation and instructions for use. | 1995 | International Journal of Human-Computer Interaction | 999 |
Robert A. Virzi. | Refining the Test Phase of Usability Evaluation: How Many Subjects Is Enough? | 1992 | Human factors | 497 |
Ángela Di Serio, María Blanca Ibáñez, Carlos Delgado Kloos. | Impact of an augmented reality system on students' motivation for a visual art course. | 2013 | Computers & Education | 446 |
Ritu Agarwal, Viswanath Venkatesh. | Assessing a Firmâs Web Presence: A Heuristic Evaluation Procedure for the Measurement of Usability. | 2002 | Information systems research | 444 |
Zhao Huang, Morad Benyoucef. | From e-commerce to social commerce: A close look at design features. | 2013 | Electronic Commerce Research and Applications | 429 |
Laura Faulkner. | Beyond the five-user assumption: Benefits of increased sample sizes in usability testing. | 2003 | Behavior Research Methods, Instruments, & Computers | 381 |
Joseph A Cafazzo, Mark Casselman, Nathaniel Hamming, Debra K Katzman, Mark R Palmert | Design of an mHealth App for the Self-management of Adolescent Type 1 Diabetes: A Pilot Study. | 2012 | Journal of Medical Internet Research | 359 |
Roberto Verganti. | Design, Meanings, and Radical Innovation: A Metamodel and a Research Agenda. | 2008 | Journal of product innovation management | 352 |
T. Boren, J. Ramey. | Thinking aloud: reconciling theory and practice. | 2000 | IEEE transactions on professional communication | 298 |
Monique W.M. Jaspers. | A comparison of usability methods for testing interactive health technologies: Methodological aspects and empirical evidence. | 2009 | International Journal of Medical Informatics | 285 |
Performance analysis of authors
For authors, citations are a useful indicator of influence.[40] The most active authors were ranked according to the h-index. Table 3 lists the top 10 authors with the most academic publications. Khajouei R, with 15 publications, followed by Jaspers MWM and Kubler A, each with 14 publications, have the highest numbers of publications on usability testing. In terms of citations, Kubler A has received the most, with 647, while Jaspers MWM, with 638 citations and an h-index of 12, holds the highest rank among authors.
Element | h-index | g-index | m-index | Total Citations (TC) | Publications (NP) | First Publication Year |
---|---|---|---|---|---|---|
Jaspers MWM | 12 | 14 | 0.75 | 638 | 14 | 2007 |
Hornbaek K | 10 | 12 | 0.588 | 271 | 12 | 2006 |
Kubler A | 10 | 14 | 1 | 647 | 14 | 2013 |
Schnall R | 10 | 12 | 0.909 | 550 | 12 | 2012 |
Cafazzo JA | 9 | 12 | 0.692 | 577 | 12 | 2010 |
Khajouei R | 9 | 15 | 0.692 | 264 | 15 | 2010 |
Stinson JN | 9 | 12 | 0.529 | 408 | 12 | 2006 |
Straus SE | 9 | 10 | 0.692 | 236 | 10 | 2010 |
Bates DW | 8 | 13 | 1.143 | 227 | 13 | 2016 |
Hertzum M | 8 | 9 | 0.364 | 366 | 9 | 2001 |
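The indices in Table 3 follow standard definitions; the sketch below is illustrative rather than the study's own code and computes them from a vector of per-paper citation counts. The m-index convention is inferred from the table, whose values are consistent with dividing the h-index by the number of years from the author's first publication through 2022, inclusive.

```r
# h-index: largest h such that h papers have at least h citations each.
h_index <- function(citations) {
  s <- sort(citations, decreasing = TRUE)
  sum(s >= seq_along(s))
}

# g-index: largest g such that the g most-cited papers together have >= g^2 citations.
g_index <- function(citations) {
  s <- sort(citations, decreasing = TRUE)
  sum(cumsum(s) >= seq_along(s)^2)
}

# m-index (m-quotient): h-index divided by career length in years.
# Table 3 is consistent with career length = analysis year - first publication year + 1.
m_index <- function(citations, first_pub_year, analysis_year = 2022) {
  h_index(citations) / (analysis_year - first_pub_year + 1)
}

# Hypothetical example (placeholder citation counts, not an author from Table 3):
cites <- c(40, 25, 18, 12, 9, 7, 5, 3, 1, 0)
h_index(cites)                        # 6
g_index(cites)                        # 10
m_index(cites, first_pub_year = 2013) # 0.6
```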
Performance analysis of journals
The best scientific journals are authentic sources of information on recent scientific advances.[41] Between 1983 and 2021, 1733 journals published articles in this field. Table 4 lists the most influential journals in usability testing research since 1983.
Sources | Articles |
---|---|
JMIR mHealth and uHealth | 125 |
Journal of Medical Internet Research | 117 |
International Journal of Human-Computer Interaction | 91 |
Interacting with Computers | 89 |
Journal of Usability Studies | 83 |
International Journal of Human-Computer Studies | 76 |
International Journal of Medical Informatics | 74 |
JMIR Research Protocols | 71 |
Behaviour & Information Technology | 58 |
Journal of Biomedical Informatics | 56 |
The most productive journal in usability testing research was JMIR mHealth and uHealth, which tops the list with 125 published studies. JMIR mHealth and uHealth (JMU) focuses on health and biomedical applications in mobile and tablet computing, pervasive and ubiquitous computing, wearable computing, and domotics. JMU is indexed in PubMed, PubMed Central, MEDLINE, and the Science Citation Index Expanded (SCIE).[42] It is followed by the Journal of Medical Internet Research (JMIR), the leading peer-reviewed journal for digital medicine, health, and health care in the internet age, which is indexed in more than 18 bibliographic databases and abstracting services.[43] Apart from technology-related journals, few journals excel in productivity in this field; thus, thematic specialization can be a strategic choice.[44]
Figure 3 compares the growth of the top journals up to 2021. The top 10 journals have published 16% of the articles. Further investigation shows that 1441 journals published only one article each in the field of usability; in other words, these journals account for only 27% of the articles. The findings indicate that a small percentage of journals are responsible for a high percentage of the scientific production in a specific field.
Performance analysis of authors' affiliations
Performance analysis of authors' affiliations is a bibliometric method that examines the distribution and impact of authors' affiliations in a given research field or topic. It can help to identify the most productive and influential institutions, countries, or regions, as well as the patterns of collaboration and mobility among them. It can also reveal the diversity and interdisciplinarity of research groups and their contributions to scientific knowledge.[45] The analysis of author affiliations can substantially affect how research is interpreted.[46] This method can be useful for evaluating the quality and relevance of research outputs, as well as for informing policy and decision-making regarding research funding, support, and development.[45] The top twenty most active institutions in usability testing research are shown in Figure 4 (a). According to the findings, Toronto University tops the list by a wide margin. The growth trend of scientific production across institutions also shows that Toronto University has the steepest growth, rising from 1 publication in 1989 to 19 publications in 2021 (Figure 4 (b)). In 2022, Toronto University was ranked 16th among universities worldwide and first in Canada by the SCImago Institutions Rankings.[47] It should be noted that this count is based on all the authors of the publications.
Performance analysis of authors' countries
The results of the study clearly demonstrate the dominant position of certain countries in the field of usability testing. In total, 101 countries contributed to usability testing-related research output, and 74% of the publications were produced by the top 10 countries. Table 5 and Figure 5 (a) show that the USA, by a large margin over the others, published the most papers (n=5499) and had the highest total citations (28412). The next most productive countries were Canada (n=1458) and the United Kingdom (n=933). It should be noted that these counts are based on all the authors of the publications. About 91% of the studies were conducted through domestic and international collaborations between authors. The Netherlands had the highest average article citation rate (17.08), followed by the USA (16.55), and Canada and Italy (both 15.48).
Country | Publications (NP) | Total Citations (TC) | Average Article Citations |
---|---|---|---|
USA | 5499 | 28412 | 16.55 |
Canada | 1458 | 5032 | 15.48 |
United Kingdom | 933 | 4729 | 14.24 |
Netherlands | 594 | 3364 | 17.08 |
Italy | 362 | 2353 | 15.48 |
China | 623 | 2219 | 9.25 |
Germany | 556 | 2129 | 9.55 |
Spain | 407 | 1884 | 11.78 |
Korea | 375 | 1290 | 8.66 |
Australia | 419 | 1278 | 9.54 |
Figure 5 (b) shows the distribution of corresponding authors' countries, with their numbers of publications broken down into Single Country Publications (SCP) and Multiple Country Publications (MCP), together with the MCP ratios. According to the figure, most publications are written by authors from the same country and are therefore SCPs. The USA was the top country, with a total of 1717 publications; of these, 1523 were single country and 194 were multiple country publications, giving an MCP ratio of 0.113 and indicating that the majority of US usability testing publications involved authors from only one country. In the analysis of MCP ratios, Bulgaria, Malawi, Yemen, and Zimbabwe have the highest MCP ratios (MCP ratio=1), each with a single article written as an MCP. The MCP ratio (MCP articles/total publications per country) was calculated based on the number of MCPs obtained from inter-country collaborations.[48]
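As a concrete check of this ratio, the one-line computation below reproduces the reported US value from the counts given in the text (194 MCP articles out of 1717 corresponding-author publications).

```r
# MCP ratio = multiple-country publications / total publications of the country
mcp_ratio <- function(mcp, total) mcp / total

# USA, using the counts reported in the text: 194 MCP of 1717 total publications
round(mcp_ratio(194, 1717), 3)  # 0.113, matching the reported MCP ratio
```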
Keyword analysis
In bibliometric studies, keyword analysis is an important topic. A hot spot in a discipline can be identified by counting the frequency with which keywords appear.[49] Figure 6 (a) illustrates the most frequent title words used in papers. Among the top 20 title words, "usability" occurred in 1960 (19.22%) records, "design" in 1119 (10.97%) records, "study" in 907 (8.9%) records, and "evaluation" in 894 (8.76%) records.
The trend of title words over time, based on the calculation of word weight, showed that "usability" had the highest weight from 2009 to 2019, followed by "design" (2011-2020), "study" (2014-2020), "evaluation" (2010-2019), and "testing" and "user" (2010-2019). Figure 6 (b) shows the most frequent author keywords used in papers. Among the top 20 author keywords, "mobile health" occurred in 69 (15.3%) records, "user-computer interface" in 45 (9.98%) records, and "human computer interaction" in 43 (9.53%) records.
The trend of author keywords over time, based on the calculation of word weight, showed that "user-centered design" had the highest weight from 2012 to 2020, followed by "usability" (2012-2020), "usability testing" (2012-2019), and "usability evaluation" (2009-2019). It should be noted that "user-centered design" has been introduced as a MeSH term. The Medical Subject Headings (MeSH) are the subject headings that appear in MEDLINE/PubMed, the NLM Catalog, and other NLM databases.[50]
The second category of bibliometric analysis is scientific mapping. Science mapping techniques encompass various methods such as citation analysis, co-citation analysis, bibliographic coupling, co-word analysis, and co-authorship analysis.[30]
Co-citation author network visualization
A co-citation author network is a scientometric method that visualizes relationships between authors based on their Total Link Strength (TLS) scores. By measuring how often pairs of authors are cited together by the same documents, co-citation analysis reveals useful insights into a field, and TLS scores enhance this analysis by capturing the strength of these co-citation relationships, providing a more nuanced picture of scholarly influence.[51] Figure 7 shows the network visualization map of the co-citation analysis of cited authors with a minimum of 100 citations. Of 92709 cited authors, only 49 met this threshold. The co-citation analysis of cited authors yields four clusters. Circle size indicates the number of co-citations: the larger the circle, the more frequently the author is co-cited in the usability field, and circles of the same color belong to the same cluster. Jakob Nielsen, with a TLS of 10264, has the strongest co-citation links, followed by James Lewis with a TLS of 3932.
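To make the total link strength figures concrete, the sketch below builds an author co-citation matrix from the cited-author lists of a few hypothetical citing documents and sums each author's link strengths. This mirrors, in simplified form, how co-citation links and TLS are derived in tools such as VOSviewer; the author names and data are placeholders, not values from the study.

```r
# Each element: the distinct cited authors appearing in one citing document's references
# (hypothetical data; real input would come from the WoS cited-references field).
cited_authors <- list(
  c("nielsen j", "lewis jr", "brooke j"),
  c("nielsen j", "lewis jr"),
  c("nielsen j", "bangor a", "brooke j")
)

authors <- sort(unique(unlist(cited_authors)))
cocitation <- matrix(0, length(authors), length(authors),
                     dimnames = list(authors, authors))

# Every document in which two authors are cited together adds 1 to their link strength
for (refs in cited_authors) {
  idx <- match(refs, authors)
  cocitation[idx, idx] <- cocitation[idx, idx] + 1
}
diag(cocitation) <- 0

# Total link strength (TLS) of an author = sum of its co-citation link strengths
total_link_strength <- rowSums(cocitation)
sort(total_link_strength, decreasing = TRUE)
```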
Co-authorship countries network visualization
Figure 8 (a) shows ten main clusters of co-authorship based on authors' countries. Circle size indicates the number of publications, and line thickness indicates cooperation between nations.[52] The minimum number of documents for a country and the minimum number of citations for a document were both set at five. Of 110 countries, only 69 met the threshold. Most countries had a cooperative relationship with the United States. Based on TLS, the top country was the USA (TLS=549), followed by England (TLS=308), Canada (TLS=223), and Germany (TLS=213). These productive countries have strong collaborative relationships. In the overlay visualization by year, the USA, Finland, Denmark, Sweden, and England are among the earliest contributors, while Saudi Arabia, Pakistan, and Indonesia are among the most recent (Figure 8 (b)).
Keywords network visualization
In the network and cluster analysis, color corresponds to a specific research cluster, whereas circle size indicates how often a keyword appears in the analyzed articles. The thickness of the line connecting the circles indicates the strength of the association between keywords.[53] The co-occurrence map based on author keywords shows four main clusters, each marked with a color in Figure 9 (a). The minimum number of author keyword occurrences was set at 25; of the total 11046 author keywords, only 25 were plotted. Based on TLS, the top five author keywords were usability (TLS=438), user-centered design (TLS=349), usability testing (TLS=183), mhealth (TLS=177), and user experience (TLS=159). The co-occurrence map based on Keywords Plus shows three main clusters. The minimum number of Keywords Plus occurrences was set at 25; of the total 4519 Keywords Plus terms, only 48 were plotted. The top Keywords Plus terms were care (TLS=363), design (TLS=345), technology (TLS=299), usability (TLS=237), and health and impact (TLS=228). The co-occurrence author keywords network visualization is presented in Figure 9 (b).
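The keyword maps are built the same way as the co-citation map, except that the units are keywords and a minimum-occurrence threshold (25 in this study) filters the terms before the network is drawn. The sketch below illustrates that filtering step on hypothetical keyword strings; the threshold is lowered so the toy example returns a non-empty result.

```r
# Hypothetical author-keyword field: one semicolon-separated string per article
author_keywords <- c("Usability; User-Centered Design; mHealth",
                     "Usability; User Experience",
                     "User-Centered Design; Usability Testing; mHealth")

# Normalize, split, and count occurrences across all articles
kw <- trimws(unlist(strsplit(tolower(author_keywords), ";")))
occurrences <- sort(table(kw), decreasing = TRUE)

# Keep only keywords that meet the occurrence threshold (the study used 25;
# a threshold of 2 is used here so the toy example selects something)
min_occurrences <- 2
selected <- names(occurrences)[occurrences >= min_occurrences]
selected  # these terms would be passed to the co-occurrence mapping step
```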
DISCUSSION
This research presents the current status of scientific production and collaboration networks in usability testing. According to the findings, 5273 documents are indexed in WoS, and the volume of publications has been growing since 1983. Bibliometric analysis can be broadly divided into two main categories: performance analysis and science mapping.[54] Both analyses were performed in this study. Detailed findings are discussed below.
Overall, the results show the annual growth of usability testing published by this research community. With the development of information technology, the trend of publishing usability testing research is likely to continue to grow. The bibliometric analysis of other scientific fields also indicates the increasing growth of publications. This finding is consistent with the findings of other studies.[55–57]
According to the analysis, Khajouei R has the highest number of publications, while Jaspers MWM has the highest h-index. In other words, some researchers are even more influential with fewer published articles on usability testing. Performance analysis of authors assesses productivity and helps identify the core authors and productive specialists. This, in turn, can be valuable for collaboration and co-authoring future research.[30]
The United States, Canada, the United Kingdom, and the Netherlands produced the most publications among the 110 countries. The USA, by a large margin over the others, published the most papers, while the Netherlands had the highest average article citation rate, followed by the USA. In the USA, despite the 36-year span since its first publication and a total of 5499 documents, the average number of citations per document appears relatively low, probably because the large volume of literature produced in the most productive countries lowers the average article citation rate over the years.[48] Publications with the most citations usually have a significant impact on the bibliography of a subject because of their pioneering contributions.
Developing countries show the highest levels of international cooperation: despite their small output, they publish papers through international collaboration. Such collaboration can allow researchers to work with colleagues who speak English and to receive study opportunities in the future. Citations increase as a result of international collaboration.[58,59] The participation of researchers from different countries in an international study indicates a high standard of work and can facilitate future international collaborations.[60] Collaborative research at the international level provides investigators with a new opportunity to maximize their impact and interact with other researchers.[61]
The analysis of co-authorship clusters based on authors' countries showed that most countries had a cooperative relationship with the United States. Co-authorship clusters can reveal the structure and dynamics of scientific communities, as well as the patterns of knowledge production and diffusion.[62] Furthermore, collaborations expand and enrich a researcher's professional network, making it more extensive and interconnected.[63]
The keyword analysis highlights essential research areas and explains how they are interconnected; keyword analysis is therefore an essential part of understanding trends in research.[64,65] Garfield (1990) advocated the statistical analysis of keywords for identifying research focus areas and predicting research trends within a discipline.[66] A hot spot in a discipline can be identified by counting the frequency with which keywords appear.[49,67] Author keywords can be beneficial for the development of terminology.[68] Keywords Plus, on the other hand, are terms automatically derived from the titles of an article's cited references that do not necessarily appear in the article's title.[69] Keywords Plus terms are more descriptive than author keywords and can express the contents of articles more concisely.[64] Keywords Plus and author keywords are commonly used as units of analysis in bibliometric studies of the knowledge structure of scientific fields; Keywords Plus is as effective as author keywords, but it does not provide as comprehensive a representation of an article's content. Although there is limited research evidence on the effectiveness of Keywords Plus, Garfield believes that Keywords Plus terms are more effective at capturing the depth and variety of an article's content.[70] In this study, we analyzed both keyword types for a comprehensive review. According to the findings, the trend of keywords has changed in recent years. Usability testing studies have evolved over time, and words and themes that were not previously used have begun to appear. Keywords are mainly selected based on MeSH terms, owing to the special emphasis journals place on keyword selection and on increasing the visibility of publications.[71,72]
Limitations
While this is, to our knowledge, the first bibliometric analysis of usability studies, and it draws important conclusions from its examination of the papers in this field, it has some limitations. Data were extracted from WoS only, and other databases were not examined; consequently, publications indexed only in other databases may have been missed. There may also be a language bias: although no linguistic requirements were placed on the publications in our study, the majority of WoS publications are written in English. In addition, this study primarily focused on journals and excluded other scientific publications (such as books, proceedings papers, and reports). Consequently, several significant studies, particularly emerging research, may have been missed. Furthermore, the quality of WoS publications varies widely, and a weighted analysis of publications based on quality evaluation was beyond the scope of our investigation. Our study may therefore have accorded equal weight to publications of varying quality.
Future Directions
According to the findings, the scope of usability testing is wide, with broad application in various domains, especially in health information technology and communication. Future research could explore further aspects of usability testing, as summarized below:
- To get a more comprehensive view of the global state of usability testing publications, future studies could include publications in languages other than English.
- Future studies could also draw on databases other than WoS.
- The quality of publications could be evaluated and a weighted analysis of publications conducted.
- Future research could explore how usability testing can enhance the benefits of health information technology.
- The best practices and criteria for usability testing could be investigated.
- Exploring how usability testing can help communicate the value of HIT products or services, particularly in emerging markets, could be a significant topic for future research.
CONCLUSION
The current study reveals the most recent hotspots and trends in research and the status of global collaboration in usability testing research. The findings identify prominent countries, institutions, journals, original articles, and authors, indicating the most influential research channels. Several bibliometric methods were implemented to analyze both performance and citation networks. Usability testing has become more popular in recent years, as shown by the rising number of publications on the topics covered in this paper. The main drivers of this trend are the development of and demand for health information technology and the need for effective interaction between humans and computers. In other words, usability testing has recently shown promising applications in information systems research and human-computer interaction. We believe this study's findings may prove helpful in shaping the direction of future studies on usability testing.
Cite this article
Baghini MS, Mohammadi M, Norouzkhani N. Usability Testing: A Bibliometric Analysis Based on WoS Data. J Scientometric Res. 2024;13(1):9-24.
References
- Schall MCJ, Cullen L, Pennathur P, Chen H, Burrell K, Matthews G, et al. Usability Evaluation and Implementation of a Health Information Technology Dashboard of Evidence-Based Quality Indicators. Comput Inform Nurs. 2017;35(6):281-8. [Google Scholar]
- Bates DW, Gawande AA. Improving safety with information technology. N Engl J Med. 2003;348(25):2526-34. [Google Scholar]
- Kim MO, Coiera E, Magrabi F. Problems with health information technology and their effects on care delivery and patient outcomes: a systematic review. J Am Med Informatics Assoc. 2017;24(2):246-50. [Google Scholar]
- Coiera E, Ash J, Berg M. The unintended consequences of health information technology revisited. Yearb Med Inform. 2016;25(01):163-9. [Google Scholar]
- National Academies of Sciences Engineering Medicine. Taking Action Against Clinician Burnout: A Systems Approach to Professional Well-Being. Natl Acad Press. 2019:334 [Google Scholar]
- Carayon P, Hoonakker P. Human factors and usability for health information technology: old and new challenges. Yearb Med Inform. 2019;28(01):71-7. [Google Scholar]
- Yen P-Y. Health Information Technology Usability Evaluation: Methods, Models, and Measures [Internet]. Columbia University. 2010:145. Available from: http://gradworks.umi.com/3420882.pdf
[Google Scholar] - Ebnehoseini Z, Tara M, Meraji M, Deldar K, Khoshronezhad F, Khoshronezhad S, et al. Usability Evaluation of an Admission, Discharge, and Transfer Information System: A Heuristic Evaluation. Open access Maced J Med Sci. 2018;6(11):1941-5. [Google Scholar]
- Mugisha A, Nankabirwa V, TylleskÀr T, Babic A. A usability design checklist for Mobile electronic data capturing forms: the validation process. BMC Med Inform Decis Mak [Internet]. 2019;19(1):4 Available fromhttps://pubmed.ncbi.nlm.nih.gov/30626390
[Google Scholar] - Sinabell I, Ammenwerth E. Agile, Easily Applicable, and Useful eHealth Usability Evaluations: Systematic Review and Expert-Validation. Appl Clin Inform [Internet]. 2022/03/09. 2022;13(1):67-79. Available from: https://pubmed.ncbi.nlm.nih.gov/35263798
[Google Scholar] - Bevan N, Carter J, Harker S. ISO 9241-11 revised: What have we learnt about usability since 1998?. 2015:143-51. [Google Scholar]
- Yao P, Gorman PN. Discount usability engineering applied to an interface for Web-based medical knowledge resources. In: Proceedings of the AMIA Symposium. 2000:928 [Google Scholar]
- Marcilly R, Kushniruk AW, Beuscart-Zephir M-C, Borycki EM. In: Digital Healthcare Empowering Europeans. 2015:115-9. [Google Scholar]
- HIMSS Usability Task Force. Promoting Usability in Health Organizations: initial steps and progress toward a healthcare usability maturity model. Heal Inf Manag Syst Soc. 2011 [Google Scholar]
- Farzandipour M, Sadeqi Jabali M, Nickfarjam AM, Tadayon H. Usability evaluation of selected picture archiving and communication systems at the national level: Analysis of usersâ viewpoints. Int J Med Inform [Internet]. 2021;147:104372 Available fromhttps://www.sciencedirect.com/science/article/pii/S1386505620319080
[Google Scholar] - Walden A, Garvin L, Smerek M, Johnson C. User-centered design principles in the development of clinical research tools. Clin Trials. 2020;17(6):703-11. [Google Scholar]
- Agharezaei Z, Khajouei R, Ahmadian L, Agharezaei L. Usability evaluation of a laboratory information system. Dir Gen. 2013;10(2):1-12. [Google Scholar]
- Tase A, Vadhwana B, Buckle P, Hanna GB. Usability challenges in the use of medical devices in the home environment: A systematic review of literature. Appl Ergon. 2022;103:103769 [Google Scholar]
- Hyzy M, Bond R, Mulvenna M, Bai L, Dix A, Leigh S, et al. System Usability Scale Benchmarking for Digital Health Apps: Meta-analysis. JMIR mHealth uHealth. 2022;10(8):e37290 [Google Scholar]
- Zanatta F, Giardini A, Pierobon A, D'Addario M, Steca P. A systematic review on the usability of robotic and virtual reality devices in neuromotor rehabilitation: patients' and healthcare professionals' perspective. BMC Health Serv Res. 2022;22(1):523 [Google Scholar]
- Seow W-L, Md Ariffin UK, Lim SY, Mohamed NA, Lee KW, Devaraj NK, et al. A Systematic Review on the Usability of Web-Based Applications in Advocating Consumers on Food Safety. Foods. 2022;11(1):115 [Google Scholar]
- Muro-Culebras A, Escriche-Escuder A, Martin-Martin J, Roldán-Jiménez C, De-Torres I, Ruiz-Muñoz M, et al. Tools for evaluating the content, efficacy, and usability of mobile health apps according to the consensus-based standards for the selection of health measurement instruments: systematic review. JMIR Mhealth Uhealth. [Google Scholar]
- Patel B, Thind A. Usability of mobile health apps for postoperative care: systematic review. JMIR Perioper Med. 2020;3(2):e19099 [Google Scholar]
- Adesina N, Dogan H, Green S, Tsofliou F. Effectiveness and Usability of Digital Tools to Support Dietary Self-Management of Gestational Diabetes Mellitus: A Systematic Review. Nutrients. 2021;14(1):10 [Google Scholar]
- Alvarez-Peregrina C, Sanchez-Tena MA, Martin M, Villa-Collar C, Povedano-Montero FJ. Multifocal contact lenses: A bibliometric study. J Optom [Internet]. 2020/09/07. 2022;15(1):53-9. Available fromhttps://pubmed.ncbi.nlm.nih.gov/32907788
[Google Scholar] - Aria M, Cuccurullo C. bibliometrix: An R-tool for comprehensive science mapping analysis. J Informetr. 2017;11(4):959-75. [Google Scholar]
- Peng C, He M, Cutrona SL, Kiefe CI, Liu F, Wang Z, et al. Theme Trends and Knowledge Structure on Mobile Health Apps: Bibliometric Analysis. JMIR Mhealth Uhealth [Internet]. 2020;8(7):e18212 Available fromhttp://www.ncbi.nlm.nih.gov/pubmed /32716312
[Google Scholar] - Guler AT, Waaijer CJF, Palmblad M. Scientific workflows for bibliometrics. Scientometrics [Internet]. 2016;107(2):385-98. Available from: https://doi.org/10.1007/s11192-016-1885-6
[Google Scholar] - Povedano-Montero FJ, Álvarez-Peregrina C, Hidalgo Santa Cruz F, Villa-Collar C, Sánchez Valverde J. Bibliometric Study of Scientific Research on Scleral Lenses. Eye Contact Lens [Internet]. 2018:44 Available from: https://journals.lww.com/claojournal/Fulltext/2018/11002/Bibliometric_Study_of_Scientific_Research_on.48.aspx
[Google Scholar] - Donthu N, Kumar S, Mukherjee D, Pandey N, Lim WM. How to conduct a bibliometric analysis: An overview and guidelines. J Bus Res [Internet]. 2021;133:285-96. Available fromhttps://www.sciencedirect.com/science/article/pii/S0148296321003155
[Google Scholar] - Zupic I, Čater T. Bibliometric Methods in Management and Organization. Organ Res Methods [Internet]. 2014;18(3):429-72. Available from: https://doi.org/10.1177/1094428114562629
[Google Scholar] - Bawack RE, Wamba SF, Carillo KDA, Akter S. Artificial intelligence in E-Commerce: a bibliometric study and literature review. Electron Mark [Internet]. 2022;32(1):297-338. Available fromhttps://doi.org/10.1007/s12525-022-00537-z
[Google Scholar] - Collection C. [cited 2022 Jul 26];WEB OF SCIENCE Âź CORE COLLECTION (W of Science) Web of Science Core Collection. Web Sci [Internet]. 2022 Available fromhttps://clarivate.com/webofsciencegroup/solutions/web-of-science-core-collection/
[Google Scholar] - Collection W of SC. Web of Science Core Collection Overview [Internet]. 2021 Available fromhttps://webofscience.help.clarivate.com/en-us/Content/wos-core-c ollection/wos-core-collection.htm
- Grzybowska K, Awasthi A. Literature review on sustainable logistics and sustainable production for Industry 4.0. Sustain Logist Prod Ind 40. 2020:1-18. [Google Scholar]
- Milian EZ, Spinola M de M, Carvalho MM de. Fintechs: A literature review and research agenda. Electron Commer Res Appl [Internet]. 2019;34:100833 Available fromhttps://www.sciencedirect.com/science/article/pii/S1567422319300109
[Google Scholar] - Jan N, Eck V, Waltman L. [cited 2022 Aug 30];VOSviewer Manual [Internet]. Manual for VOSviewer version 1. 6 . 1 8. 2022 Available from: chrome-extension://efaidnbmn nnibpcajpcglclefindmkaj/https://www.vosviewer.com/documentation/Manual_VO Sviewer_1.6.18.pdf [Google Scholar]
- Komperda R. In: Computer-Aided Data Analysis in Chemical Education Research (CADACER): Advances and Avenues. 2017:91-116. [Google Scholar]
- Lewis JR. IBM Computer Usability Satisfaction Questionnaires: Psychometric Evaluation and Instructions for Use. Int J Hum Comput Interact. 1995;7(1):57-78. [Google Scholar]
- Burgess S, Martín-Martín P. Linguistic recycling and its relationship to academic conflict: An analysis of authors' responses to direct quotation. AILA Rev. 2020;33(1):47-66. [Google Scholar]
- Somasundaram R. Top 100 Journals in the World with Highest Impact Factor 2022 - iLovePhD [Internet]. iLovePhD. 2022 [cited 2022 Jul 31]. Available from: https://www.ilovephd.com/top-100-journals-in-the-world-with-impact-factor/
[Google Scholar] - JMIR. [cited 2022 Jul 31];JMIR mHealth and uHealth [Internet]. 2022 Available fromhttps://mhealth.jmir.org/
[Google Scholar] - JMIR. [cited 2022 Jul 31];Journal of Medical Internet Research [Internet]. 2022 Available fromhttps://www.jmir.org/
[Google Scholar] - Forliano C, De Bernardi P, Yahiaoui D. Entrepreneurial universities: A bibliometric analysis within the business and management domains. Technol Forecast Soc Change [Internet]. 2021;165:120522 Available fromhttps://www.sciencedirect.com /science/article/pii/S0040162520313482
[Google Scholar] - Reyes-Gonzalez L, Gonzalez-Brambila CN, Veloso F. Using co-authorship and citation analysis to identify research groups: a new way to assess performance. Scientometrics [Internet]. 2016;108(3):1171-91. Available fromhttps://doi.org/10.1007/s11192-016-2029-8
[Google Scholar] - Mannocci A, Osborne F, Motta E. Geographical trends in academic conferences: An analysis of authorsâ affiliations. Data Sci. 2019;2(1-2):181-203. [Google Scholar]
- SCImago Institutions Rankings. University Rankings 2022 [Internet]. 2022 [cited 2022 Jul 31]. Available from: https://www.scimagoir.com/rankings.php?sector=Highereduc
- Colombino E, Prieto-Botella D, Capucchio MT. Gut Health in Veterinary Medicine: A Bibliometric Analysis of the Literature. Animals. 2021;11(7):1997 [Google Scholar]
- Wang M, Chai L. Three new bibliometric indicators/approaches derived from keyword analysis. Scientometrics [Internet]. 2018;116(2):721-50. Available fromhttps://doi.org/10.1007/s11192-018-2768-9
[Google Scholar] - National Library of Medicine. National Library of Medicine. 2021 [cited 2022 Aug 13]. Available fromhttps://www.nlm.nih.gov/mesh/meshhome.html?_gl=1*423gxf*_ga*Mzk4OTIwMTQuMTU5NT E0MDUyMA..*_ga_P1FPTH9PL4*MTY2MDM3NDU5OS4xLjEuMTY2MDM3NTExNy4w
Welcome to Medical Subject Headings [Internet]. - Tandon A, Kaur P, MÀntymÀki M, Dhir A. Blockchain applications in management: A bibliometric analysis and literature review. Technol Forecast Soc Change. 2021;166:120649 [Google Scholar]
- Van Eck N, Waltman L. Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics. 2010;84(2):523-38. [Google Scholar]
- Wang K, Herr I. Machine-Learning-Based Bibliometric Analysis of Pancreatic Cancer Research Over the Past 25 Years. Front Oncol. 2022:12 [Google Scholar]
- De Stefano D, Giordano G, Vitale MP. Issues in the analysis of co-authorship networks. Qual Quant. 2011;45(5):1091-107. [Google Scholar]
- Luong D-H, Nguyen X-A, Ngo T-T, Tran M-N, Nguyen H-L. Social Media in General Education: A Bibliometric Analysis of Web of Science from 2005-2021. J Scientometr Res. 2023;12(3):680-90. [Google Scholar]
- Agarwal A, Durairajanayagam D, Tatagari S, Esteves SC, Harlev A, Henkel R, et al. Bibliometrics: tracking research impact by selecting the appropriate metrics. Asian J Androl. 2016;18(2):296-309. [Google Scholar]
- Gyau EB, Sakuwuda K, Asimeng E. A Comprehensive Bibliometric Analysis and Visualization of Publications on Environmental Innovation. J Scientometr Res. 2023;12(3):544-57. [Google Scholar]
- Sweileh WM, AbuTaha AS, Sawalha AF, Al-Khalil S, Al-Jabi SW, Zyoud SH, et al. Bibliometric analysis of worldwide publications on multi-, extensively, and totally drug-resistant tuberculosis (2006-2015). Multidiscip Respir Med. 2016;11:45 [Google Scholar]
- Tahamtan I, Safipour Afshar A, Ahamdzadeh K. Factors affecting number of citations: a comprehensive review of the literature. Scientometrics. 2016;107(3):1195-225. [Google Scholar]
- Aşkun V, Cizel R. Twenty Years of Research on Mixed Methods. 2020;1:28-43. [Google Scholar]
- Grubbs JC, Glass RI, Kilmarx PH. Coauthor country affiliations in international collaborative research funded by the US National Institutes of Health, 2009 to 2017. JAMA Netw Open. 2019;2(11):e1915989-e1915989. [Google Scholar]
- . Intelligent Systems Reference Library [Internet]. 2019:179-92. Available fromhttps://doi.org/10.1007/978-3-319-91196-0_5
[Google Scholar] - Mydin F, Rahman RSARA, Mohammad WMRW. Research collaboration: enhancing the research skills and self-confidence of early career academics. Asian J Univ Educ. 2021;17(3):142-53. [Google Scholar]
- Tripathi M, Kumar S, Sonker SK, Babbar P. Occurrence of author keywords and keywords plus in social sciences and humanities research: A preliminary study. COLLNET J Sci Inf Manag [Internet]. 2018;12(2):215-32. Available from: https://doi.org/10.1080/09737766.2018.1436951
[Google Scholar] - Schodl K, Klein F, Winckler C. Mapping sustainability in pig farming research using keyword network analysis. Livest Sci. 2017;196:28-35. [Google Scholar]
- Garfield E. KeyWords Plus - ISI's breakthrough retrieval method. 1. Expanding your searching power on current-contents on diskette. Curr Contents. 1990;32:5-9. [Google Scholar]
- Huai C, Chai L. A bibliometric analysis on the performance and underlying dynamic patterns of water security research. Scientometrics. 2016;108(3):1531-51. [Google Scholar]
- Névéol A, Doğan RI, Lu Z. Author keywords in biomedical journal articles. AMIA Annu Symp Proc. 2010;2010:537-41. [Google Scholar]
- Clarivate. KeyWords Plus generation, creation, and changes [Internet]. 2022 [cited 2022 Aug 15]. Available from: https://support.clarivate.com/ScientificandAcademicResearch/s/article/KeyWords-Plus-generation-creation-and-changes?language=en_US
[Google Scholar] - Zhang J, Yu Q, Zheng F, Long C, Lu ZDZ. Comparing keywords plus of WOS and author keywords: A case study of patient adherence research. J Assoc Inf Sci Technol [Internet]. 2016;67(4):967-72. Available from: https://asistdl.onlinelibrary.wiley.com/doi/full/10.1002/asi.23437
[Google Scholar] - Schilhan L, Kaier C, Lackner K. Increasing visibility and discoverability of scholarly publications with academic search engine optimization. Insights UKSG J. 2021;34(1):1-16. [Google Scholar]
- Carpenter CR, Cone DC, Sarli CC. Using publication metrics to highlight academic productivity and research impact. Acad Emerg Med. 2014;21(10):1160-72. [Google Scholar]