This is the first installment in a series on impact measurement for Scholarship of Teaching and Learning (SoTL) researchers.  

This post is intended for individuals who publish in SoTL journals, contribute chapters and/or write books about SoTL, and/or attend SoTL conferences. Works of this type are considered traditional scholarship—academic treatises, grounded in the literature of a field and contributing novel ideas to the scholarly conversation.  

Citation Analysis Example 

In this section, I encourage you to follow along using an example from your own work. With thanks to Dr. Jill McSweeney, Assistant Director of the Center for the Advancement of Teaching and Learning and Assistant Professor of Wellness, who kindly volunteered for this activity, this post will use the following example: 

McSweeney, Jill, and Matthew Schnurr. 2023. “Can SoTL Generate High Quality Research While Maintaining Its Commitment to Inclusivity?” International Journal for the Scholarship of Teaching and Learning 17 (1): 1–15. https://doi.org/10.20429/ijsotl.2023.17104

A common measure of scholarly attention is citations—specifically the number of citations a work has acquired since its appearance in scholarly literature. Citation rates differ by field (Radicchi, Fortunato, and Castellano 2008; Wang and Barabási 2021). Many publications never collect a single one (Van Noorden, Maher, and Nuzzo 2014). How many times have McSweeney and Schnurr been cited since their publication in 2023? Some places to check include Google Scholar, Semantic Scholar, and Web of Science (which requires a subscription).

As of this writing, Google Scholar reports seven citations for this article (Figure 1). Some people are surprised to learn that Google Scholar is not the only option for citation tracking. In fact, Google Scholar has been criticized for its methods in compiling citation data (Doğan, Şencan, and Tonta 2016). Why do reports on citation counts differ? Different platforms look at different collections of information, giving rise to different views of the scholarly landscape.  

Figure 1. Screenshot of “Can SoTL Generate High Quality Research while Maintaining its Commitment to Inclusivity” article in Google Scholar. 

Let’s examine the same article elsewhere. According to Semantic Scholar, this article (Figure 2) has been cited four times as of this writing, one of which (Loo et al. 2024) is marked as “highly influential.” 

Figure 2. Screenshot of “Can SoTL Generate High Quality Research while Maintaining its Commitment to Inclusivity” article in Semantic Scholar. 

Semantic Scholar uses machine learning to uncover context around cited works. Other tools (Web of Science, Scite) incorporate an element of nuance into the otherwise cold tally of citation counts. These distinctions show that not all citations are equal. Is a work briefly noted, or is it foundational to the citing paper? This “highly influential” citation seems to belong to the latter category.  

Loo et al.’s paper (2024) is newer and hasn’t been cited yet, but tracking its progression could yield interesting results. If others cite this second paper, that’s arguably a win for McSweeney and Schnurr’s 2023 work as well, since Loo et al.’s article would not have existed in its current form without the influence of theirs.  
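
If you’d rather check these numbers programmatically than through each website, Semantic Scholar offers a free public API. The sketch below (Python, using the requests library) looks up the example article by DOI and pulls its citation count and “highly influential” citation count; the exact field names and rate limits are worth confirming against Semantic Scholar’s current API documentation before leaning on the output.

```python
import requests

# Look up the example article by DOI via the Semantic Scholar Graph API.
DOI = "10.20429/ijsotl.2023.17104"
url = f"https://api.semanticscholar.org/graph/v1/paper/DOI:{DOI}"
params = {"fields": "title,citationCount,influentialCitationCount"}

response = requests.get(url, params=params, timeout=30)
response.raise_for_status()
paper = response.json()

print(paper["title"])
print("Citations:", paper["citationCount"])
print("Highly influential citations:", paper["influentialCitationCount"])
```

Because each platform indexes a different slice of the literature, treat a lookup like this as one data point alongside Google Scholar and Web of Science rather than a definitive tally.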

Explore Your Work

  1. Can you find your published work in Google Scholar? Try one of the other options too. Is there a difference in the total citation tally? 
  2. Who has cited your work? In what context was your scholarship included? What does this tell you about the conversation happening around your work? 

Context and Authorship 

Publications can be solo projects or team efforts. If the latter: what were your contributions, and how did they shape the finished product? Author credits are multidimensional. Perhaps you coauthored a piece with a preeminent scholar, or you may have worked with your own students on a project.  

Looking at our example (McSweeney and Schnurr 2023), we might ask each author to consider how they worked together to produce this article. A contribution framework like the CRediT taxonomy may be of use. Jessie Moore also writes about authorship and author order in SoTL projects in previous CEL blog posts.

For numerical approximations of author-level influence, we could look at the h-index of each contributor. A researcher has an h-index of h when h of their papers have each been cited at least h times (Hirsch 2005; Vinyard and Colvin 2023). Google Scholar, Semantic Scholar, and Web of Science (which requires a subscription) all report h-index values alongside other author-level calculations. 

According to Google Scholar, Dr. McSweeney has an h-index of 8 (meaning she has 8 papers with at least 8 citations each). Dr. Schnurr has an h-index of 19. 
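
To make the definition concrete, here is a minimal sketch of how an h-index can be computed from a list of per-paper citation counts. The numbers are invented for illustration and are not either author’s actual record.

```python
def h_index(citation_counts):
    """Return the largest h such that h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Ten hypothetical papers and their citation counts (illustration only).
print(h_index([42, 18, 12, 9, 8, 8, 7, 5, 2, 0]))  # prints 7
```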

Explore Your Work

  1. With whom have you published? What were those collaborative experiences like?  
  2. Can you find your h-index, or another author-level calculation of impact? Try looking up the same for your coauthors, editors, and/or research colleagues.  

Journals and More 

Academic journals, conferences, and publishers also carry their own impact metrics. A commonly used one is the Journal Impact Factor (JIF), which measures how often a journal’s recent articles are cited over a set window, typically the preceding two years (Vinyard and Colvin 2023). JIF can be viewed (with a subscription) in Clarivate’s Journal Citation Reports (JCR), though many journal websites display it, and several free alternatives exist. Scimago is a popular alternative. 
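
To see the arithmetic behind the classic two-year JIF, consider a small sketch with invented numbers: a journal’s 2024 JIF is the number of citations received in 2024 by the items it published in 2022 and 2023, divided by the number of citable items it published in those two years.

```python
# Classic two-year Journal Impact Factor, with invented numbers for illustration.
citations_in_2024_to_2022_23_items = 150  # citations received in 2024 by 2022-23 articles
citable_items_2022_23 = 100               # articles and reviews published in 2022-23

jif_2024 = citations_in_2024_to_2022_23_items / citable_items_2022_23
print(f"2024 JIF: {jif_2024:.2f}")  # prints 2024 JIF: 1.50
```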

The journal that published the example article (International Journal for the Scholarship of Teaching and Learning) doesn’t appear in either JCR or Scimago. This is common in SoTL. If this occurs, consider: 

  • Why did you pick this venue? What does it mean for you to publish there?  
  • What is its place in the SoTL community?  
  • What do you know about its publisher and that publisher’s reputation?  

For illustration, let’s look at a different article: 

McSweeney, Jill M., and R. E. Moore. 2023. “Understanding the Impact of a Pandemic on the Work of Educational Developers.” International Journal for Academic Development, December, 1–14. https://doi.org/10.1080/1360144X.2023.2297412

This journal (International Journal for Academic Development) is in Scimago (Figure 3). According to Scimago, IJAD has an h-index of 36. Scrolling through its Scimago profile reveals additional metrics that can help illuminate the impact of McSweeney and Moore’s 2023 work. 

Figure 3. Screenshot of the International Journal for Academic Development in the Scimago platform. 

Explore Your Work

  1. Which journals do you enjoy reading and/or contributing to? Can you find them in any of the places mentioned above? 
  2. Which publishers or societies publish your work, and/or produce things that you find meaningful? What space(s) do they occupy in the SoTL community? 
  3. Which SoTL conferences do you enjoy attending? What is their standing in the field? If you’ve presented: what was the acceptance rate? 

The metrics covered in this section can form the core of your research story. The next part of this series will move away from traditional work and methods to the more unconventional.  


References 

Doğan, Güleda, İpek Şencan, and Yaşar Tonta. 2016. “Does Dirty Data Affect Google Scholar Citations?” Proceedings of the Association for Information Science and Technology 53 (1): 1–4. https://doi.org/10.1002/pra2.2016.14505301098

Hirsch, J. E. 2005. “An Index to Quantify an Individual’s Scientific Research Output.” Proceedings of the National Academy of Sciences 102 (46): 16569–72. https://doi.org/10.1073/pnas.0507655102

Loo, Daron Benjamin, Jonathan Tang Kum Khuan, Jinat Rehana Begum, and Deborah Choo. 2024. “Tensions at the University and Living in Liminality: English Language Teachers Navigating through New Research Expectations.” rEFLections 31 (3): 1023–43. https://doi.org/10.61508/refl.v31i3.276351

McSweeney, Jill M., and R. E. Moore. 2023. “Understanding the Impact of a Pandemic on the Work of Educational Developers.” International Journal for Academic Development, 1–14. https://doi.org/10.1080/1360144X.2023.2297412

McSweeney, Jill, and Matthew Schnurr. 2023. “Can SoTL Generate High Quality Research While Maintaining Its Commitment to Inclusivity?” International Journal for the Scholarship of Teaching and Learning 17 (1): 1–15. https://doi.org/10.20429/ijsotl.2023.17104

Radicchi, Filippo, Santo Fortunato, and Claudio Castellano. 2008. “Universality of Citation Distributions: Toward an Objective Measure of Scientific Impact.” Proceedings of the National Academy of Sciences 105 (45): 17268–72. https://doi.org/10.1073/pnas.0806977105

Van Noorden, Richard, Brendan Maher, and Regina Nuzzo. 2014. “The Top 100 Papers.” Nature 514 (7524): 550–53. https://doi.org/10.1038/514550a

Vinyard, Marc, and Jaimie Beth Colvin. 2023. Demystifying Scholarly Metrics: A Practical Guide. 1st ed. London: Bloomsbury Visual Arts. https://doi.org/10.5040/9798400639180

Wang, Dashun, and Albert-László Barabási. 2021. The Science of Science. 1st ed. Cambridge: Cambridge University Press. 


About the Author  

Ellen Cline is the Engineering & Physical Science Librarian at Elon University. She holds an MSLS from the University of North Carolina at Chapel Hill and previously served as a Research Librarian at Missouri University of Science & Technology. 

How to Cite This Post   

Cline, E. 2025. “Telling Your Research Story: Citations, H-index, and Journal Impact Factor in SoTL.” Center for Engaged Learning (blog). May 30, 2025. https://www.centerforengagedlearning.org/telling-your-research-story-citations-h-index-and-journal-impact-factor-in-sotl/.