Selected Indicators

Indicators assessed for the ranking

They measure how well each university meets the strategic objective of disseminating open source software.

The secret to successful classification is the ability to ascertain the key characteristics on which the classification is to be based

Bailey, 1994

The criteria are an essential part of this ranking because they determine what information is collected for each university when it is assessed. We therefore follow a methodology to identify the issues most relevant to the objective of the ranking, and we test the criteria to verify their effectiveness.

The most important elements have been the involvement of experts in open source software and in the development of university rankings, interviews with universities, and the testing of the indicators.

Criteria design

This is a descriptive study based on observation and document analysis, following the classification of Montero and León (2007) [PDF], and the Berlin Principles have been followed.

Experts in statistical data analysis and in open source software collaborated in preparing the report, to maximize the quality, accuracy and usefulness of the information obtained. The criteria were chosen for their high impact on the promotion of open source software. Numerical criteria are also easy to measure, objective, and yet discriminative enough to yield clearly different values, which helps separate the strengths and weaknesses of each institution.

To define the importance of each criterion, we used a Likert scale from 1 to 5, with several experts collaboratively determining the weight of each one.
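As an illustration of how such weights could be applied, the sketch below combines Likert weights with normalized indicator values into a single score per institution. The criteria, weights and values shown are assumptions made for the example, not the ranking's actual data.

```python
# Hypothetical sketch: combining expert Likert weights (1-5) with
# normalized indicator values (0-1) into a weighted score per university.
# All weights and values below are illustrative, not the ranking's real data.

criterion_weights = {          # expert consensus on a 1-5 Likert scale
    "OCW usage": 4,
    "GSoC enrollment": 3,
    "Open source web server": 2,
}

university_values = {          # indicator values already normalized to 0-1
    "University A": {"OCW usage": 1.0, "GSoC enrollment": 0.4, "Open source web server": 1.0},
    "University B": {"OCW usage": 0.0, "GSoC enrollment": 0.8, "Open source web server": 0.0},
}

def weighted_score(values: dict) -> float:
    """Weighted average of the normalized indicators."""
    total_weight = sum(criterion_weights.values())
    return sum(criterion_weights[c] * values[c] for c in criterion_weights) / total_weight

for university, values in university_values.items():
    print(f"{university}: {weighted_score(values):.2f}")
```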

What type of information do the selected indicators measure?

We use quantitative indicators (to measure the impact of actions carried out by the university) and qualitative ones (to take into account the importance of each action). The indicators are grouped according to the type of information they provide, which also shows how open source software is being promoted from different areas of the university.

The criteria measure the results of the university's activity in the field of open source software. This allows us to minimize dependence on subjective factors and focus on the aspects that matter most for open source software, such as research, collaboration and knowledge transfer.

We have avoided criteria that require subjective assessments at all times because, although they may provide interesting information, they are imprecise and therefore lead to errors. For instance, it would be interesting to measure not only the number of people in each university's open source software office, but also the involvement of each one. Similarly, the expertise of the teachers who teach courses about open source software would be useful to know. But since there is no objective methodology for measuring it, this information has been discarded.

Analysis period

All information refers to the 12 months preceding the start of the study, except for those criteria that specify a different time range and the reason for it. Although it leaves out older information, this time range guarantees an up-to-date picture of each university and allows future studies to compare its evolution.

Provisional list of criteria

We are currently in the interview stage with universities (see the stages of development), so the list of criteria is neither definitive nor complete. Only some criteria are described here, to give the reader an idea of the kind of information we use.

Each criterion is listed under its aspect, together with the indicator that measures it.

Teaching criteria
- OCW usage: Is this institution a member of the OpenCourseWare Consortium, offering free and openly licensed courses accessible to anyone via the Internet?
- GSoC enrollment: Has this institution been enrolled in the Google Summer of Code program at any time in the period 2005-2010?
- Number of students at GSoC: Number of students enrolled in the Google Summer of Code between 2005 and 2010.
- OER material: Does this institution publish materials licensed as OER (Open Educational Resources)?

Research criteria
- Open Source Science Project member: Is this institution supporting Open Science?
- Jasig member: Is this institution supporting the development of open source projects that benefit higher education?
- Kuali member: Is this institution supporting the creation of an open source financial system for higher education?

Technological criteria
- Web server: Is the server of the university's main webpage free and open source software (Nginx, Apache...)?
- Web content license: Is the content of the university's webpage published under a free license?
- Uses an open source VLE: Is this institution using Sakai, Moodle or any other open source system as its VLE (Virtual Learning Environment)?

Webometrics
- Impact: Total number of external backlinks * total number of different domains containing those backlinks (see the sketch after this list).
- MajesticMillion: Website rating according to MajesticSEO.
- WIF (Web Impact Factor): Total number of external backlinks / total number of indexed pages (see the sketch after this list).
- Dispersion of fresh citations: Number of fresh (recent) distinct domains containing a backlink to the OCW.
- Dispersion of total citations: Historic number of distinct domains containing a backlink to the OCW.
- Number of fresh OCW citations: Number of fresh backlinks to the OCW.
- Number of total OCW citations: Total number of backlinks to the OCW.
- Indexed pages: Total number of indexed OCW pages according to MajesticSEO.
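Two of the webometric indicators above, Impact and WIF, are defined as simple arithmetic combinations of backlink and page counts. The sketch below shows how they would be computed; the counts are placeholders for illustration, not measurements of any real university.

```python
# Illustrative computation of the two derived web indicators defined above.
# The counts below are placeholders, not data from any real university.

external_backlinks = 12_500   # total external backlinks to the site
referring_domains = 830       # distinct domains containing those backlinks
indexed_pages = 4_200         # total indexed pages of the site

# Webometrics Impact: backlinks weighted by how widely they are spread.
webometrics_impact = external_backlinks * referring_domains

# WIF (Web Impact Factor): backlinks relative to the size of the site.
wif = external_backlinks / indexed_pages

print(f"Webometrics Impact: {webometrics_impact}")
print(f"WIF: {wif:.2f}")
```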

Webometrics

Today there is no single, accurate source for analyzing cybermetric indexes. There are even major indexes that could not be used because no information is available for them or, where it is, there is no standardized way of measuring it. Therefore, the information used here is an approximation of what really should be measured.

For instance, criteria such as 'total visits', 'unique users' or 'article readings' are hard to measure because there is no clear, single definition of each. As a result, different tools define each of these criteria differently and therefore provide different information about the same resource.

Because webometrics tools use different methodologies to measure data, the results for each website vary depending on the tool used, which would favor some sites over others. To avoid this problem, this study tested the best-known webometrics services and chose the most reliable sources.

There are dozens of web services for performing webometric analysis (number of external links, indexed pages, etc.). Each has its own search engines, applies its own logic when searching for links and gets different results, since none has scrutinized 100% of the Internet. The results therefore vary depending on the service. To avoid the particularities of each service affecting this analysis, we only used data from one web service.

We considered several sources: Google.com, Bing.com, OpenSiteExplorer.org, MajesticSeo.com and Searchmetrics. Tests were run with all of them, retrieving information about 5 universities. MajesticSEO was the one that gave the most information, so it was taken as the reference. Furthermore, we chose it because it collects two different types of information, recent and historical, which allows us to measure the work of each FOSS bureau both over the past few weeks and since it started. Only for the geographical origin of the links has Searchmetrics been used, since it is the only tool that provides this information.
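As an illustration, the sketch below shows one way fresh and historic backlink figures could be kept side by side for each institution once exported from the chosen service. The record layout and all numbers are assumptions made for the example, not real exported data.

```python
# Illustrative aggregation of fresh vs. historic backlink data per university,
# as it might be tabulated after exporting reports from the chosen service.
# The record layout and all figures are assumptions, not real exported data.

from dataclasses import dataclass

@dataclass
class OCWCitations:
    fresh_backlinks: int    # backlinks found in the recent crawl window
    fresh_domains: int      # distinct referring domains in that window
    total_backlinks: int    # all backlinks observed historically
    total_domains: int      # distinct referring domains observed historically

ocw_citations = {
    "University A": OCWCitations(120, 45, 3_400, 610),
    "University B": OCWCitations(35, 12, 980, 150),
}

for university, c in ocw_citations.items():
    recent_share = c.fresh_backlinks / c.total_backlinks
    print(f"{university}: {c.fresh_backlinks} fresh / {c.total_backlinks} total "
          f"backlinks ({recent_share:.1%} recent), "
          f"{c.fresh_domains}/{c.total_domains} referring domains")
```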

Do you want to give your opinion?

If you have knowledge about open source software and/or university rankings and want to contribute your experience to the team, we are open to hearing your opinion.


Contact Us


