Get to know at a glance the key points of the Open Source universities Ranking.
We are likely to see more university rankings, not less. But the good news is they will become increasingly specialised
Simon Marginson, editorial board member of the Times Higher Education ranking and adviser to the ARWU ranking
It is a classification of universities according to their commitment to the use, diffusion and creation of open source software. It is a specialized ranking that rates educational institutions with 18 indicators measuring their work on open learning across all sectors. It has been compiled by a committee of industry experts.
There are 4 characteristics evaluated for every university: teaching criteria, research criteria, technological criteria and webometrics. Each of these aspects has a certain number of indicators, each with its own weight. Here you can check our methodology.
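As a minimal sketch of how such a weighted methodology typically works (the weights and indicator values below are invented for illustration; they are not the OSuR's published figures), each university's composite score can be computed as a weighted average of its per-criterion scores:

```python
# Hypothetical sketch of a weighted composite ranking score.
# Criterion names match the four areas in the text; the weights and
# per-university scores are made-up example values.

def composite_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of normalized criterion scores (each in 0..100)."""
    total_weight = sum(weights.values())
    return sum(scores[name] * weights[name] for name in weights) / total_weight

# Example: four criteria with assumed weights and one university's scores.
weights = {"teaching": 0.30, "research": 0.30, "technological": 0.25, "webometrics": 0.15}
scores = {"teaching": 70.0, "research": 55.0, "technological": 80.0, "webometrics": 40.0}

print(round(composite_score(scores, weights), 2))  # prints 63.5
```

Dividing by the total weight keeps the score on the same 0–100 scale as the indicators, even if the weights do not sum exactly to 1.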
Because universities promote open source software in many areas, a standardized methodology is needed to measure, compare, and understand their strengths and weaknesses. The OSuR offers the opportunity to learn in detail the impact of the actions taken, in which areas a university performs best, and how it spreads open source software relative to other universities.
The OSuR arises from universities' need for a standard methodology that measures all of them homogeneously, under the same criteria, so that the information collected is useful and accurate.
Universities promote open source software in many different ways. They run many initiatives, and it is not easy to know them all because they come from different areas of education, which makes them even harder to assess. As a result, the real effort each university makes to popularize free software cannot be known.
To evaluate the work done by each university, it is necessary to have an easy way to understand its strengths and weaknesses, to learn about other universities' initiatives, and to identify which aspects should be improved.
The main objective is to analyze the work done by universities in creating and spreading open source software. We do this through a ranking based on empirical criteria that provides information to measure and compare how strongly open source software is supported at each university:

1. There are universities with a strong commitment to free software; they deserve to be known, publicized, and recognized for their work for the common good.
2. The criteria measure the spread of free software at all levels of the university. Thanks to that, we can identify which aspects are strongest and which have room for improvement.
3. By presenting the initiatives and areas in which each university works on open source software, universities can collaborate with each other to improve wherever needed.
We intend to provide extra motivation for universities to devote more effort to promoting free software, since within them there are people with great knowledge of and commitment to this philosophy who can further advance open source software.
This ranking helps publicize each university's initiatives in this area, and creates a methodology to assess and compare them.
In its scope. While most rankings classify universities globally by the impact of their scientific production, this ranking focuses specifically on the spread of open source software. Because its scope is limited, the criteria used to rank universities are interrelated.
One of the most important parts of defining the indicators is deciding what is valued when ranking universities.
There are many rankings that rate universities, but this one differs from the others in 4 aspects:
A ranking is a user-oriented process: it is developed to be used by anyone. The aspects and indicators to study have been defined with the ranking's main groups of users in mind:
1. Analysis of public documents: the website of the institution
We work as accurately as possible to obtain correct data. Even so, we cannot guarantee 100% accuracy for all information from all universities, because it has not always been possible to obtain information directly from the institution. To allow verification of the data used, we provide the spreadsheets with the results for download.
We do not attempt in any way to rank universities by the quality of their education or the level of resources provided. This is not an academic ranking and should not be used as a criterion for choosing a university. It is a ranking focused on one particular area of the university, and its purpose is to reveal the efforts made by each institution in this area and help improve them.
To choose a university there are many other indicators, not considered in this study (such as the number of teachers per student, research projects ...). Several rankings worldwide assess these other aspects, and although there is no single methodology for ranking universities, they can be consulted when an academic ranking is what you need. You can find them in our bibliography.
There is no public ranking that measures how universities promote open source software. University rankings are mainly intended to evaluate universities using educational criteria, and serve to help students choose a college.
Each ranking, according to its purpose, measures different aspects of universities. Some of them focus on cybermetric (web) aspects, others on scientific or academic ones ... more about other university rankings.
The indicators that make up our methodology are designed to measure the full range of open source software activities carried out at a university. Like any other ranking, it is a simple way to gauge the work done by universities, yet it is imperfect, because other methods using other factors may exist. This is a problem common to all rankings. To maximize the accuracy of our ranking methodology, it follows the principles set by the IREG (International Ranking Expert Group), a group of experts convened by UNESCO that developed the Berlin Principles on Ranking of Higher Education Institutions [PDF], which serve to unify the methodologies followed in preparing rankings so that they can be compared. We also explain why we have chosen these factors and the benefits of the methodology we use.
In addition, the weights given to each criterion were chosen by a vote of a group of professors and industry professionals. The study was fully funded by PortalProgramas.com, a software download site with no ties to universities or any other organization.
In scientific research, it is a basic condition that investigations be reproducible. To that end, the OSuR provides information on the data used, the sources, and the methodology.
The universities themselves have provided information on several criteria, which is also considered a verified source.
The data sources are public web services (such as search engines), manual data collection, and data provided by the universities themselves. Both the classification criteria and the data from which the report was written have been released, for a high degree of transparency. View the work methodology.
The only criteria that cannot be reproduced accurately, due to their volatile nature, are the web metrics. Unlike citations in scientific publications, website links vary over time: they may decrease when links, pages, or entire websites are removed. So an exact back-calculation of these criteria is not possible, only a very rough estimate.
The information has been extracted through a manual analysis of the websites of all the universities in the study, and from contacts with the universities. External web services have been used only for webometric measurements, as those aspects are very complex to measure.
Nearly all university rankings consider the scientific production of a university as one of the criteria used for classification. For this ranking, bibliometric and webometric indicators used in other rankings have been adapted to the scope of open source software. For example, the 'number of citations', an important indicator of the impact of a piece of scientific research, translates into 'external links', which is likewise an important indicator for open source software. In addition, we have considered other webometric indicators from other rankings, such as 'web impact'.
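As an illustration of the analogy above (the formula and the counts are assumptions for the sketch, not the ranking's published method), webometric indicators of this kind are often simple ratios, e.g. external inlinks per hosted page, where the raw counts come from third-party tools such as search engines:

```python
# Illustrative sketch only: a "web impact"-style webometric indicator as the
# ratio of external links pointing at a site to the number of pages it hosts.
# The counts would come from third-party tools; the values here are invented.

def web_impact(external_inlinks: int, pages: int) -> float:
    """External inlinks per hosted page; higher means greater web visibility."""
    if pages == 0:
        return 0.0  # avoid division by zero for an unindexed site
    return external_inlinks / pages

# Made-up counts for two hypothetical university domains: (inlinks, pages).
counts = {
    "univ-a.example.edu": (120_000, 40_000),
    "univ-b.example.edu": (30_000, 50_000),
}

for domain, (inlinks, pages) in counts.items():
    print(domain, round(web_impact(inlinks, pages), 2))
```

Normalizing by page count matters because a large website naturally attracts more links; the ratio rewards visibility rather than sheer size, much as citations per paper does in bibliometrics.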
Because they are obtained through third-party tools, outside the university, which count the number of pages, their results are approximate: