From Metrics to Action: What a 6-Year Case Study Teaches Universities About Improving Webometrics

Rankings are most useful when they help universities improve, not just compare. A 2025 open-access paper published in Discover Education analyzes Al Salam University’s Webometrics indicators over six years (2019 to 2024) and proposes a practical, phased strategy to strengthen performance across three criteria: Visibility (Impact), Transparency (Openness), and Excellence.

Read the full paper (PDF): A strategic framework for enhancing university rankings based on webometrics criteria: a descriptive-analytical approach

What did the researchers study, and why does it matter?

The authors focus on a practical challenge faced by many institutions, especially those in remote or resource-constrained contexts: limited digital infrastructure and weak online discoverability can reduce global visibility, even when meaningful academic work exists.

To ground their conclusions in evidence, the paper analyzes historical Webometrics indicator trends for Al Salam University over a six-year period (2019–2024), then translates those trends into an actionable framework that the authors describe as adaptable to other institutions. The analysis is structured to connect indicator changes with operational priorities that universities can realistically implement over time.

What evidence does the paper present from 2019 to 2024?

The paper provides year-by-year values for global rank, local rank, and the three indicators tracked in the study: Impact, Openness, and Excellence. All figures are rank positions, so lower numbers mean better performance.

Year   Global Rank   Local Rank   Impact   Openness   Excellence
2019        26,675           48   27,538      8,602        6,115
2020        27,210           49   28,245      5,819        6,626
2021        27,137           48   27,095      6,488        6,650
2022        27,171           48   27,131      6,492        6,650
2023        27,344           56   27,226      8,183        7,237
2024        22,766           42   30,558      2,276        6,967
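
To make the trend concrete, here is a minimal sketch (not from the paper) that loads the table above and prints year-over-year changes; because the figures are rank positions, a negative delta means improvement.

```python
# A minimal sketch (not from the paper) of the table above.
# Webometrics values are ranks, so a negative delta = improvement.

data = {
    # year: (global_rank, local_rank, impact, openness, excellence)
    2019: (26675, 48, 27538, 8602, 6115),
    2020: (27210, 49, 28245, 5819, 6626),
    2021: (27137, 48, 27095, 6488, 6650),
    2022: (27171, 48, 27131, 6492, 6650),
    2023: (27344, 56, 27226, 8183, 7237),
    2024: (22766, 42, 30558, 2276, 6967),
}

fields = ("global", "local", "impact", "openness", "excellence")

years = sorted(data)
for prev, curr in zip(years, years[1:]):
    deltas = [c - p for p, c in zip(data[prev], data[curr])]
    changes = ", ".join(f"{f}: {d:+d}" for f, d in zip(fields, deltas))
    print(f"{prev}->{curr}  {changes}")
```

Running this shows, for example, that from 2023 to 2024 the global rank improved by 4,578 positions and Openness by 5,907, while Impact weakened by 3,332.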

What changed in 2024, and why is it important?

The paper identifies 2024 as the strongest overall year in the series, with Al Salam University reaching a global rank of 22,766 and a local rank of 42. The most dramatic gain appears in Openness, which improved from 8,183 in 2023 to 2,276 in 2024.

The authors interpret this improvement as evidence that open-access practices and stronger awareness around sharing research outputs can produce measurable gains. They also note that not all indicators move together: the Impact indicator weakened from 27,226 in 2023 to 30,558 in 2024, suggesting that external visibility and third-party referencing still required focused work.

How does the paper connect Webometrics indicators to operational causes?

A core strength of the study is that it does not treat ranking indicators as abstract numbers. It links indicator performance to practical institutional realities, such as:

  • Visibility (Impact): The authors connect weaker Impact performance with reduced online influence and link-profile limitations, and they argue for sustained digital outreach and credibility-building actions.
  • Transparency (Openness): The paper highlights the role of open-access publishing behaviors and researcher presence on academic platforms in shifting openness results.
  • Excellence: The study describes relative stability in Excellence values across years, with fluctuations that the authors interpret as reflecting changes in publication quality and influence patterns.

What methods did the researchers use to support their conclusions?

The paper combines trend analysis of Webometrics historical results (2019–2024) with strategic diagnostics designed to explain why indicators move. The authors describe using a descriptive-analytical approach, SWOT analysis, and situation analysis. They also report using external analytics tools such as Majestic and Ubersuggest to support backlink and visibility diagnosis.

This layered approach strengthens the credibility of the framework because it connects indicator movement with measurable digital signals and strategic context, rather than relying on assumptions.
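
The paper's backlink diagnostics rely on commercial tools, but the underlying idea, profiling where backlinks come from, is easy to sketch. Below is a minimal example (not the paper's method) that summarizes referring domains from a backlink CSV export; the column name "source_url" is a placeholder to adjust to the actual export format of whichever tool is used.

```python
# A minimal sketch, not the paper's method: summarize referring domains
# from a backlink export. Tools such as Majestic can export backlinks
# as CSV; the column name "source_url" is a placeholder.

import csv
from collections import Counter
from urllib.parse import urlparse

def referring_domains(csv_path: str, column: str = "source_url") -> Counter:
    """Count how many backlinks come from each referring domain."""
    domains = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = urlparse(row[column]).netloc.lower()
            if domain:
                domains[domain] += 1
    return domains

# Example usage (file name is hypothetical):
# for domain, count in referring_domains("backlinks_export.csv").most_common(10):
#     print(f"{domain:40s} {count}")
```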

What framework does the paper propose?

The authors propose a data-driven framework designed to be practical for universities with limited resources. The model targets three Webometrics-aligned pillars in an integrated way: Visibility (Impact), Transparency (Openness), and Excellence. The paper describes the framework as adaptable for any institution that wants to translate ranking indicators into operational improvement steps.

Quote from the paper: “The framework is designed to be adaptable to any institution.”

What implementation timeline does the paper recommend?

The paper outlines a phased timeline (a small scheduling sketch follows the list):

  • Phase 1 (0–6 months): actions related to website readiness, SEO, visibility activation, and link-building foundations.
  • Phase 2 (6–12 months): actions focused on transparency and openness, including profile consistency and open-access publishing support.
  • Phase 3 (1–3 years): longer-term excellence building through partnerships, training, and deeper institutional research capacity strengthening.
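
To make the timeline trackable, the phases can be expressed as data and turned into concrete milestone dates. The sketch below takes the phase contents from the paper's timeline, but the start date and the 30-day month approximation are illustrative, not part of the paper.

```python
# A small scheduling sketch: phase contents from the paper's timeline;
# the start date and month approximation are illustrative.

from datetime import date, timedelta

PHASES = [
    # (label, duration in months)
    ("Phase 1: website readiness, SEO, visibility, link-building", 6),
    ("Phase 2: transparency, openness, profile consistency", 6),
    ("Phase 3: partnerships, training, research capacity", 24),  # years 1-3
]

start = date(2026, 1, 1)  # hypothetical start date
for label, months in PHASES:
    end = start + timedelta(days=months * 30)  # rough month length
    print(f"{start} -> {end}  {label}")
    start = end
```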

What measurable targets does the paper include?

The paper includes measurable targets that are meant to be tracked during implementation. Examples include increasing website traffic and backlinks within the first year and ensuring that faculty have updated profiles on academic platforms such as Google Scholar, ResearchGate, and ORCID.
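
As a minimal sketch of how one such target could be monitored, the snippet below checks that a list of faculty ORCID iDs resolve via ORCID's public API. The iD list is hypothetical (the one shown is ORCID's documentation example), and similar spot checks could be extended to Google Scholar or ResearchGate profiles.

```python
# A minimal sketch: verify that faculty ORCID iDs resolve via ORCID's
# public API. The iD list is hypothetical.

import requests

FACULTY_ORCIDS = [
    "0000-0002-1825-0097",  # placeholder (ORCID's documentation example)
]

def orcid_record_exists(orcid_id: str) -> bool:
    """Return True if the public ORCID record resolves."""
    resp = requests.get(
        f"https://pub.orcid.org/v3.0/{orcid_id}",
        headers={"Accept": "application/json"},
        timeout=10,
    )
    return resp.status_code == 200

for oid in FACULTY_ORCIDS:
    status = "ok" if orcid_record_exists(oid) else "missing"
    print(f"{oid}: {status}")
```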

Why does the paper emphasize infrastructure and access?

The authors argue that network and website stability are foundational, because discoverability, indexing, content access, and research visibility all depend on reliable infrastructure. In resource-constrained environments, this emphasis is practical: improvements in openness and visibility are difficult to sustain if access to institutional web assets is inconsistent.
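
A lightweight availability check is one way to start acting on this emphasis. The sketch below, using a placeholder URL that is not from the paper, records HTTP status and latency for an institutional site; in practice a check like this would run on a schedule and its results would be logged over time.

```python
# A minimal availability check, assuming a hypothetical institutional
# URL. Recording status and latency like this is one simple way to
# monitor the infrastructure reliability the authors emphasize.

import time
import requests

URL = "https://www.example-university.edu"  # placeholder, not from the paper

def check(url: str) -> None:
    start = time.monotonic()
    try:
        resp = requests.get(url, timeout=15)
        elapsed = time.monotonic() - start
        print(f"{url} -> HTTP {resp.status_code} in {elapsed:.2f}s")
    except requests.RequestException as exc:
        print(f"{url} -> unreachable ({exc})")

check(URL)
```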

How does the paper address fairness and limitations?

The paper acknowledges that language, resource gaps, and unequal digital capacity can influence web-based indicators. It also recognizes that web indicators can be misused if institutions attempt shortcuts. The framework presented in the study focuses on sustainable improvements rooted in openness, credibility, and research quality rather than manipulation.

Why this paper is worth reading

This study contributes to the Webometrics community by converting indicators into an operational plan supported by multi-year evidence. Its case-based approach is particularly valuable for universities that want a realistic improvement path that matches institutional capacity, while staying aligned with the three pillars the paper uses: visibility, openness, and excellence.

References

  • Gasmalla, K., Almamoun, O., & Elsiddig, J. (2025). A strategic framework for enhancing university rankings based on webometrics criteria: a descriptive-analytical approach. Discover Education. (Open-access PDF)

Last Updated: January 24, 2026

webometrics.org