

Date: Monday, 01 Sep 2014 10:20
Recently, there has been a lot of finger wagging about King Abdulaziz University (KAU), Jeddah, signing up highly cited researchers as secondary affiliations. The idea behind this was to climb the ladder of the Shanghai rankings, the Academic Ranking of World Universities. These rankings include an indicator, based on Thomson Reuters' (TR) lists of highly cited researchers, which until now has credited universities for researchers who list them as a secondary affiliation.

The Shanghai Ranking Consultancy decided that this year they would  count secondary affiliations in the old but not the new list "at the suggestion of many institutions and researchers including some Highly Cited Researchers".

It is possible that the highly cited researchers mentioned have upset their primary affiliations, which may have noticed that the indicator points accruing to KAU would come out of their own scores. Counting only the primary affiliations in the new list meant that institutions such as Stanford, the Tokyo Institute of Technology, Indiana University Bloomington, the University of Sydney and the Indian Institute of Science lost several points on this indicator.

The highly cited indicator is unique among the well known international rankings because when a researcher changes his or her affiliation, all of his or her papers go too. It does not matter whether a university has employed a researcher for a day or a decade; it will still get the same credit in this indicator. Everything depends on what the researcher puts down as his or her affiliation or affiliations.

All of this is just one manifestation of a problem that has been latent in academic publishing for some years, namely the issue of the affiliation that researchers use when submitting papers or articles. There has probably been quite a bit of small scale fiddling going on for years, with researchers with doctorates from selective universities giving those places as affiliations rather than the technical or education colleges where they are teaching or adjuncts picking the most prestigious of the several institutions where they work.

The best known case of creative affiliation  was that of Mohammed El Naschie whose publication career included questionable claims to affiliation with Cambridge, Frankfurt, Surrey and Alexandria Universities (see High Court of Justice Queen's Bench Division: Neutral Citation Number: [2012] EWHC 1809 (QB)).

Most of these claims did no one any good or any harm, apart from a little embarrassment. However, the Alexandria affiliation, combined with Thomson Reuters' distinctive method of counting citations and the university's relatively few publications, propelled Alexandria into the world's top five for research impact and top 200 overall in the 2010 Times Higher Education (THE) World University Rankings.

It is possible that many of the researchers who have signed up for KAU will start showing up in massively cited multi-contributor publications, many of them in physics, that will boost otherwise obscure places into the upper sections of the research impact indicator of the THE rankings.

TR have said that they did not count physics articles with more than 30 authors when they prepared their recent list of highly cited researchers. This might reduce the scores obtained by KAU, Panjab University and some other institutions if TR follow the same procedure in the coming THE world rankings. The issue, however, is not confined to physics.

It is time that journals, databases and ranking organisations began to look carefully at affiliations. At the least, journals should start checking claims and rankers might consider counting only one affiliation per author.
Author: "Richard Holmes (noreply@blogger.com)"
Date: Sunday, 24 Aug 2014 01:17
The Shanghai Rankings have had a reputation for reliability and consistency. The latest rankings have, however, undermined that reputation a little. There have been two methodological changes of which one, not counting Proceedings Papers in the Nature and Science and Publications indicators, may not be of any significance. The other is the use of a new list of Highly Cited Researchers prepared by Thomson Reuters covering citations between 2002 and 2012. In this year's rankings this was combined with the old list which had not been updated since 2004.

One result of this is that there have been some very dramatic changes in the scores for Highly Cited Researchers this year. University of California Santa Cruz's score has risen from 28.9 to 37.9, Melbourne's from 24 to 29.3 and China University of Science and Technology's from 7.2 to 24.5 while that of the Australian National University has fallen from 32.3 to 24.8 and Virginia Polytechnic Institute's from 22.9 to 11.4.

This has had a noticeable impact on total scores. Santa Cruz has risen from the 101-150 band to 93rd place, Melbourne from 54th to 44th and China University of Science and Technology from the 201 - 300 band to the 150-200 band. The Australian National University has fallen from 66th place to 74th and Indiana University at Bloomington has dropped from 85th place to the 101-150 band.


In the top 20, this year's ARWU is more volatile than the two previous editions but still less volatile than any other international ranking. The top 20 universities of 2013 rose or fell an average of 0.65 places.
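The "average place change" figures in the tables that follow can be reproduced with a simple calculation: for each university in the earlier edition's top group, take the absolute difference between its two ranks and average the results. A minimal sketch with invented ranks; the handling of universities that drop out of the table, and of banded ranks, is an assumption rather than the blog's stated procedure:

```python
def noise_index(ranks_year1, ranks_year2):
    """Mean absolute change in rank for universities present in both editions.

    ranks_year1, ranks_year2: dicts mapping university name -> rank.
    Universities missing from the later edition are skipped here,
    which is an illustrative assumption.
    """
    changes = [abs(ranks_year2[u] - r)
               for u, r in ranks_year1.items() if u in ranks_year2]
    return sum(changes) / len(changes)

# Hypothetical example: three universities moving 0, 1 and 2 places.
r2013 = {"A": 1, "B": 2, "C": 3}
r2014 = {"A": 1, "B": 3, "C": 5}
print(noise_index(r2013, r2014))  # 1.0
```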


Ranking                                                   Average place change (top 20)
ARWU 2013-2014                                            0.65
ARWU 2012-2013                                            0.25
ARWU 2011-2012                                            0.15
Webometrics 2013-2014                                     4.25
Center for World University Ranking (Jeddah) 2013-2014    0.90
THE World Rankings 2012-2013                              1.20
QS World Rankings 2012-2013                               1.70


Looking at the top 100 universities, the ARWU is more volatile than last year's QS rankings with the average institution moving up or down 4.92 places.

Ranking                                                   Average place change (top 100)
ARWU 2013-2014                                            4.92
ARWU 2012-2013                                            1.66
ARWU 2011-2012                                            2.01
Webometrics 2013-2014                                     12.08
Center for World University Ranking (Jeddah) 2013-2014    10.59
THE World Rankings 2012-2013                              5.36
QS World Rankings 2012-2013                               3.97







Author: "Richard Holmes (noreply@blogger.com)"
Date: Sunday, 17 Aug 2014 18:25
Publisher

Center for World-Class Universities, Shanghai Jiao Tong  University


Scope

Global. 500 institutions.


Methodology

See ARWU site.

In contrast to the other indicators, the Highly Cited Researchers indicator has undergone substantial changes in recent years, partly as a result of changes by data provider Thomson Reuters. Originally, ARWU used the old list of highly cited researchers prepared by Thomson Reuters (TR), which was first published in 2001 and updated in 2004. Since then no names have been added, although changes of affiliation submitted by researchers were recorded.

Until 2011, when a researcher listed more than one institution as his or her affiliation, credit for the highly cited indicator was divided equally. Following the recruitment of a large number of part-time researchers by King Abdulaziz University, ARWU introduced a new policy of asking researchers how their time was divided. When there was no response, secondary affiliations were counted as 16%, which was the average time given by those who responded to the survey.
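The division of credit described above can be sketched numerically. The pre-2012 rule splits credit equally among all listed affiliations; the post-survey rule gives each non-responding secondary affiliation 16%. Assigning the remainder to the primary affiliation is an assumption made for illustration:

```python
def hici_credit(n_affiliations, secondary_weight=0.16):
    """Credit fractions for one highly cited researcher's institutions.

    Returns (equal_split, survey_split) as lists ordered primary-first.
    Pre-2012 rule: equal division among all listed affiliations.
    Post-2012 rule for survey non-respondents: each secondary affiliation
    counts as 16%; giving the residual to the primary is an assumption.
    """
    equal = [1 / n_affiliations] * n_affiliations
    secondaries = n_affiliations - 1
    survey = ([1 - secondary_weight * secondaries]
              + [secondary_weight] * secondaries)
    return equal, survey

equal, survey = hici_credit(2)
print(equal)   # [0.5, 0.5] under the old equal-division rule
print(survey)  # primary 0.84, secondary 0.16 under the survey rule
```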

In 2013 TR announced that they were introducing a new list based on field-normalised citations over the period 2002-2012. However, problems with the preparation of the new list meant that it could not be used in the 2013 rankings. Instead, the Shanghai rankings repeated the 2012 scores.

During 2013, KAU recruited over 100 highly cited researchers who nominated the university as a secondary affiliation. That caused some comment by researchers and analysts. A paper by Lutz Bornmann and Johann Bauer concluded that to "counteract attempts at manipulation, ARWU should only consider primary institutions of highly cited researchers."

It seems that Shanghai has acted on this advice: "It is worth noting that, upon the suggestion of many institutions and researchers including some Highly Cited Researchers, only the primary affiliations of new Highly Cited Researchers are considered in the calculation of an institution’s HiCi score for the new list."

As a result, KAU has risen into the lower reaches of the 150-200 band on the basis of publications, some papers in Nature and Science and a modest number of primary affiliations among highly cited researchers. That is a respectable achievement but one that would have been much greater if the secondary affiliations had been included.


Perhaps Shanghai should also take note of the suggestion in a paper by Lawrence Cram and Domingo Docampo that "[s]ignificant acrimony accompanies some published comparisons between ARWU and other rankings (Redden, 2013), driven in part by commercial positioning. Given its status as an academic ranking, it may be prudent for ARWU to consider replacing its HiCi indicator with a measure that is not sourced from a commercial provider if such a product can be found that satisfies the criteria (objective, open, independent) used by ARWU."


Top Ten


Place   University
1       Harvard
2       Stanford
3       MIT
4       University of California Berkeley
5       Cambridge
6       Princeton
7       California Institute of Technology (Caltech)
8       Columbia
9=      Chicago
9=      Oxford



Countries With Universities in the Top 100



Country           Number of universities
United States     52
United Kingdom     8
Switzerland        5
Germany            4
France             4
Netherlands        4
Australia          4
Canada             4
Japan              3
Sweden             3
Belgium            2
Israel             2
Denmark            2
Norway             1
Finland            1
Russia             1
Author: "Richard Holmes (noreply@blogger.com)"
Date: Wednesday, 06 Aug 2014 21:36

The Webometrics rankings are based on web-derived data. They cover more than 22,000 institutions, far more than conventional rankings, and should always be consulted as a check on the plausibility of the others. They are, however, extremely volatile and that reduces their reliability considerably.
Publisher

Cybermetrics Lab, CSIC, Madrid



Scope

Global. 22,000+ institutions.


Methodology

From the Webometrics site.


The current composite indicator is now built as follows:
Visibility (50%)
IMPACT. The quality of the contents is evaluated through a "virtual referendum", counting all the external inlinks that the University webdomain receives from third parties. Those links are recognizing the institutional prestige, the academic performance, the value of the information, and the usefulness of the services as introduced in the webpages according to the criteria of millions of web editors from all over the world. The link visibility data is collected from the two most important providers of this information: Majestic SEO and ahrefs. Both use their own crawlers, generating different databases that should be used jointly for filling gaps or correcting mistakes. The indicator is the product of the square root of the number of backlinks and the number of domains originating those backlinks, so not only is link popularity important but, even more, link diversity. The maximum of the normalized results is the impact indicator.
Activity (50%)
PRESENCE (1/3). The total number of webpages hosted in the main webdomain (including all the subdomains and directories) of the university as indexed by the largest commercial search engine (Google). It counts every webpage, including all the formats recognized individually by Google, both static and dynamic pages and other rich files. It is not possible to have a strong presence without the contribution of everybody in the organization as the top contenders are already able to publish millions of webpages. Having additional domains or alternative central ones for foreign languages or marketing purposes penalizes in this indicator and it is also very confusing for external users.
OPENNESS (1/3). The global effort to set up institutional research repositories is explicitly recognized in this indicator, which takes into account the number of rich files (pdf, doc, docx, ppt) published in dedicated websites according to the academic search engine Google Scholar. Both the total records and those with correctly formed file names are considered (for example, the Adobe Acrobat files should end with the suffix .pdf). The objective is to consider recent publications, which are now those published between 2008 and 2012 (new period).
EXCELLENCE (1/3). The academic papers published in high impact international journals play a very important role in the ranking of Universities. Using simply the total number of papers can be misleading, so we are restricting the indicator to only those excellent publications, i.e. the university scientific output that is part of the 10% most cited papers in their respective scientific fields. Although this is a measure of high quality output of research institutions, the data provider Scimago group supplied non-zero values for more than 5200 universities (period 2003-2010). In future editions it is intended to match the counting periods between Scholar and Scimago sources.
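The quoted impact formula combines link popularity and link diversity. A minimal sketch with invented link counts, reading "the product of square root of the number of backlinks and the number of domains" literally as sqrt(backlinks) × domains; the phrasing is ambiguous and could also be read as sqrt(backlinks × domains):

```python
import math

def impact_scores(link_data):
    """Webometrics-style visibility scores, normalized to the best performer.

    link_data: dict mapping institution -> (backlinks, referring_domains).
    The combination sqrt(backlinks) * domains is a literal reading of the
    quoted description, not a confirmed reproduction of Webometrics' code.
    """
    raw = {u: math.sqrt(b) * d for u, (b, d) in link_data.items()}
    top = max(raw.values())  # best performer sets the scale
    return {u: v / top for u, v in raw.items()}

# Hypothetical institutions: same diversity, different popularity.
scores = impact_scores({"U1": (1_000_000, 100), "U2": (250_000, 100)})
print(scores["U1"])  # 1.0
print(scores["U2"])  # 0.5 -- a quarter of the backlinks, half the score
```

Note how the square root compresses raw popularity: quadrupling backlinks only doubles the score, while adding referring domains counts in full.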

Top Ten

1.    Harvard University
2.    MIT
3.    Stanford University
4.    Cornell University
5.    University of Michigan
6.    University of California Berkeley
7.    Columbia University
8.    University of Washington
9.    University of Minnesota
10.  University of Pennsylvania

Countries with Universities in the Top Hundred

USA                      66
Canada                  7
UK                          4  
Germany                3
China                      3
Japan                     2
Switzerland            2
Netherlands           1
Australia                1
Italy                         1
South Korea          1
Taiwan                   1 
Belgium                 1
Hong Kong            1
Brazil                      1 
Austria                   1
Czech Republic    1
Singapore             1        
Mexico                   1



Top Ranked in Region

USA:                             Harvard
Canada:                          Toronto
Latin America:                   Sao Paulo
Caribbean:                       University of the West Indies
Europe:                          Oxford
Africa:                          University of Cape Town
Asia:                            Seoul National University
South Asia:                      IIT Bombay
Southeast Asia:                  National University of Singapore
Middle East:                     Hebrew University of Jerusalem
Arab World:                      King Saud University
Oceania:                         Melbourne

Noise Index
Average position change of universities in the top 20 in 2013:

4.25

Comparison

Center for World University Rankings         --  0.90
Shanghai Rankings (ARWU): 2011-12      --  0.15
Shanghai Rankings (ARWU) 2012-13       --  0.25
THE WUR:  2012-13                                    --  1.20
QS  WUR    2012-13                                    --  1.70  


Average position change of universities in the top 100 in 2013

12.08

Comparison

Center for World University Rankings   --  10.59
Shanghai Rankings (ARWU) 2011-12       --   2.01
Shanghai Rankings (ARWU) 2012-13       --   1.66
THE WUR 2012-13                        --   5.36
QS WUR 2012-13                         --   3.97


Author: "Richard Holmes (noreply@blogger.com)"
Date: Thursday, 24 Jul 2014 11:31
Anyone interested in the evolution of global university rankings should take a look at 'Highly Cited researchers and the Shanghai Ranking' by Lawrence Cram of the Australian National University and Domingo Docampo of the University of Vigo, Spain.



  • The paper, which analyses the HiCi indicator in the Shanghai Academic Ranking of World Universities (ARWU), based on the Thomson Reuters Highly Cited Researchers database, notes that the loss or addition of a single highly cited researcher can have a potentially large impact on the ranking of a university, although this would not apply to the top 100, where universities typically have one or two dozen such researchers.


  • The paper also provides some economic context that might explain why Thomson Reuters has been so adamant about not allowing even the most sensible deviation of its Citations indicator from its InCites system.


"While the focus of this paper is not the commercial aspects of citations databases it is important to understand how commercial drivers might shape the availability of such data. In this respect, the most recently available shareholder earnings presentation for Thomson Reuters reveals a corporate strategy that includes resetting the cost base (i.e. reducing business costs), product simplification and attractive returns to shareholders. These business considerations may have a bearing on the commercial development of citation databases."



  • The paper describes the evolution of the Highly Cited Researchers database and some aspects of the new list introduced in 2013. They note that counting by field means that names are repeated and that the number of names in the new list may exceed the number of actual persons.

  • The number of highly cited researchers varies between fields, from 276 in Engineering to 401 in Microbiology.


  • For some universities, the score in the HiCi indicator in ARWU is very significant. For 13 universities -- 9 in the USA and 4 in Saudi Arabia -- it accounts for 30% of the total score.

  • The paper concludes by suggesting that it is time that ARWU considered changing its measurement of citations.


"Significant acrimony accompanies some published comparisons between ARWU and other rankings (Redden, 2013), driven in part by commercial positioning. Given its status as an academic ranking, it may be prudent for ARWU to consider replacing its HiCi indicator with a measure that is not sourced from a commercial provider if such a product can be found that satisfies the criteria (objective, open, independent) used by ARWU."



Author: "Richard Holmes (noreply@blogger.com)"
Date: Sunday, 20 Jul 2014 23:45
Five days ago I noted an open access paper  that analysed the new list of highly cited researchers published by Thomson Reuters (TR) and discussed the disproportionate number of secondary affiliations to a single institution, King Abdulaziz University (KAU) of Jeddah, Saudi Arabia.

The significance of this is that the number of highly cited researchers is an indicator in the Academic Ranking of World Universities (ARWU) produced by Shanghai Jiao Tong University, contributing 20% of the total score. If all affiliations given in the list are counted, including secondary affiliations, then KAU will get an extremely high score for this indicator in the forthcoming rankings and will do very well in the overall rankings.

Times Higher Education (THE) has now noticed what is going on. An article by Paul Jump reports on the findings  by Lutz Bornmann of the Max Planck Society  and Johann Bauer of the Max Planck Institute.

The THE article has been republished in Inside Higher Ed and there is more discussion in University World News by Yves Gingras of the University of Quebec.

That THE should publish such an article is a little surprising. The researchers who have listed KAU as a secondary affiliation will probably not stop with the TR highly cited researchers' list -- if they do it will be a major scandal -- but will also include it in papers published in the highly regarded journals indexed in the Web of Science database. That could include those multi-"authored", hyper-cited publications in fields such as astronomy, genetics and particle physics that have propelled institutions such as Moscow State Engineering Physics Institute and Panjab University to high places in the Citations indicator in the THE World University Rankings.

KAU has already received a high score for citations in the THE 2013 world rankings, one that is disproportionate to its very modest score for the Research: Volume, Reputation and Income indicator. It is not impossible that its secondary affiliations in the old highly cited list will start popping up in the "author" list in one or two of those Multi Author (or contributor?) Publications in this year's rankings, as might those on the new lists in the years to come.

If so, one wonders whether it is fair to single out ARWU for criticism. However, TR has announced that, in preparing the new highly cited list, they excluded physics papers with more than 30 institutional addresses.

"The methodology described above was applied to all ESI fields with the exception of Physics. The relatively large number of Highly Cited Papers in Physics dealing with high-energy experiments typically carried hundreds of author names. Using the whole counting method produced a list of high-energy physicists only and excluded those working in other subfields. For example, the number of Highly Cited Papers required for inclusion in Physics, using the standard methodology, turned out to be a remarkable 63. So, as an expedient, it was decided to eliminate from consideration any paper with more than 30 institutional addresses. This removed 436 out of 10,373 Highly Cited Papers in physics and the problem of overweighting to high-energy physics. An analysis without these papers produced a list in which the threshold for number of Highly Cited Papers was 14. It also produced a ranking in which the 2010 Nobel Prize winner in Physics Andre Geim of the University of Manchester appeared first, with 40 Highly Cited Papers. Fields of physics other than high-energy alone now appear as represented by the scientists selected."


If TR does that for this year's rankings as well they will save THE some embarrassment, although Panjab University may be wondering why its "excellence" has suddenly disappeared. We shall have to wait and see what happens.

There is an important point about the lists and their role in the rankings that needs to be made. The count of KAU affiliations in the arXiv paper is probably too high. The authors count the number of primary affiliations, then the total number of affiliations, and also the fractionated total, so that if a researcher has three affiliations then each counts as one third of an affiliation.

However, as indicated in their methodology section, ARWU surveyed the current highly cited researchers to determine how their time was divided among their affiliations and, where there was no response, assigned secondary affiliations a 16% fraction, the average percentage given by those who did respond to the survey. This would lead to a total for KAU lower than that suggested by the paper, although still implausibly high and well above its scores for the other indicators.

Author: "Richard Holmes (noreply@blogger.com)"
Date: Friday, 18 Jul 2014 07:09
Citations have become a standard feature of global university rankings, although they are measured in very different ways. Since 2003 the Shanghai Academic Ranking of World Universities has used the list of highly cited researchers published by Thomson Reuters (TR), who have now prepared a new list of about 3,500 names to supplement the old one which has 7,000 plus.

The new list got off to a bad start in 2013 because the preliminary list was based on a faulty procedure and because of problems with the assigning of papers to fields or subfields. This led to ARWU having to repeat the 2012 scores for their highly cited researchers indicator in their 2013 rankings.

The list contains a number of researchers who appear more than once. Just looking at the number of Harvard researchers for a few minutes, I have noticed that David M Sabatini, primary affiliation MIT with secondary affiliations at Broad Institute Harvard and MIT, is listed  for Biology and Biochemistry and also for Molecular Biology and Genetics.

Eric S Lander, primary affiliations with Broad Institute Harvard and MIT and secondary affiliations with MIT and Harvard, is listed three times for  Biology and Biochemistry, Clinical Medicine and Molecular Biology and Genetics.

Frank B Hu, primary affiliation with Harvard and secondary affiliation with King Abdulaziz University, Saudi Arabia, is listed under Agricultural Sciences, Clinical Medicine and Molecular Biology and Genetics.

This no doubt represents the reality of scientific research in which a single researcher might well excel in two or more closely related fields but if ARWU are just going to count the number of researchers in the new list there will be distortions if some are counted more than once.

The new list refers to achievements over the period 2002-12. Unlike the old list, which just counted the number of citations, the new one is based on normalisation by field -- 21 in this case -- and by year of publication. In other words, it is not the number of citations that matters but the numbers in relation to the world average for field and year of citation.
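The field and year normalisation described here can be sketched: each paper's citation count is divided by the world average for papers in the same field and publication year, so a 2005 physics paper is compared only with other 2005 physics papers. The baselines below are invented, and averaging the per-paper ratios is an assumption for illustration; TR's actual selection works with percentile thresholds within each field rather than a simple mean:

```python
def normalized_impact(papers, world_baseline):
    """Field- and year-normalised citation score for one researcher.

    papers: list of (field, year, citations) tuples.
    world_baseline: dict mapping (field, year) -> world-average citations.
    Each paper's count is divided by the world average for its field and
    publication year, then the ratios are averaged (illustrative only).
    """
    ratios = [cites / world_baseline[(field, year)]
              for field, year, cites in papers]
    return sum(ratios) / len(ratios)

# Hypothetical baselines: older papers accumulate more citations on average.
baseline = {("Physics", 2005): 20.0, ("Physics", 2010): 10.0}
papers = [("Physics", 2005, 40), ("Physics", 2010, 10)]
print(normalized_impact(papers, baseline))  # 1.5
```

The point of the normalisation is visible in the example: 40 citations on a 2005 paper (twice the world average) counts for more than 40 citations would on a paper in a field and year where the average is higher.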

TR acknowledge that there is a problem resulting from the growing number of massively cited, multi-authored papers and reviews, especially in the subfields of Particle and High-Energy Physics. To deal with this issue they have excluded from the analysis papers in Physics with more than thirty institutional addresses.

I do not know if TR are planning on doing this for their data for the Times Higher Education World University Rankings. If they are, places like Panjab University are in for a nasty shock.

Another noticeable thing about the new lists is the large number of  secondary affiliations. In many cases the joint affiliations seem quite legitimate. For example, there are many researchers in subjects such as Biology and Biochemistry with affiliation to an Ivy League school and a nearby hospital or research institute. On the other hand, King Abdulaziz University in Jeddah has 150 secondary affiliations. Whether Thomson Reuters or ARWU will be able to determine that these represent a genuine association is questionable.

The publication of the new lists is further evidence  that citations can be used to measure very different things. It would be unwise for any ranking organisation to use only one citations based indicator or only one database.
Author: "Richard Holmes (noreply@blogger.com)"
Date: Thursday, 17 Jul 2014 22:59
The Center for World University Rankings, based in Jeddah, Saudi Arabia, has produced a global ranking of 1,000 universities. Last year and in 2012, 100 universities were ranked. The Center is headed by Nadim Mahassen, an Assistant Professor at King Abdulaziz University.

The rankings include five indicators that measure various aspects of publication and research: Publications in reputable journals, Influence (research papers in highly influential journals), Citations, Broad Impact (h-index) and Patents (h-index).

Altogether these have a weighting of 25%, which seems on the low side for modern world class research universities. The use of the h-index, which reduces the impact of outliers and anomalous cases, is a useful addition to the standard array of indicators. So too is the use of patents filed as a measure of innovation.
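The h-index mentioned above is the largest h such that the institution (or author) has h papers with at least h citations each, which is why it dampens outliers: one massively cited paper can raise h by at most one. A minimal sketch:

```python
def h_index(citations):
    """Largest h such that at least h items have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([100, 9, 5, 3, 1]))  # 3: three papers with >= 3 citations
print(h_index([10000, 2]))         # 2: the huge outlier adds almost nothing
```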

Another 25% goes to Quality of Education, which is measured by the number of alumni receiving major international awards relative to size (current number of students according to national agencies). There would appear to be an obvious bias here towards older institutions. There is also a problem that such awards are likely to be concentrated among relatively few universities so that this indicator would  not discriminate among  those outside the world elite.

A quarter is assigned to Quality of Faculty measured by the number of faculty receiving such awards and another quarter to Alumni Employment measured by the number of CEOs of top corporations.

The last three indicators are unlikely to be regarded as satisfactory. The number of CEOs is largely irrelevant to the vast majority of institutions.

In general, these rankings are a useful addition to the current array of global rankings, but the non-research indicators are narrow and not very meaningful. There is also a very serious problem with reliability, as noted below.

Now for the standard presentation of rankings, with the addition of a noise analysis. 


Publisher

Center for World University Rankings, Jeddah, Saudi Arabia


Scope

Global. 1,000 universities.

Methodology

Quality of Education (25%) measured by alumni winning international awards relative to size.
Alumni Employment  (25%) measured by CEOs of top companies relative to size.
Quality of Faculty (25%) measured by "academics" winning international awards relative to size.
Publications in reputable journals (5%).
Influence measured by publications in highly influential journals (5%).
Citations measured by the number of highly cited papers (5%).
Broad Impact measured by h-index (5%).
Patents measured by the number of international filings (5%)

Top Ten

1.   Harvard
2.   Stanford
3.   MIT
4.   Cambridge
5.   Oxford
6.   Columbia
7.   Berkeley
8.   Chicago
9.   Princeton
10. Yale

Countries with Universities in the Top Hundred

USA               54
Japan              8
UK                   7
Switzerland    4
France            4
Germany        4
Israel              3
Canada         3
China             2
Sweden         2
South Korea  1
Russia            1
Taiwan           1
Singapore     1
Denmark        1
Netherlands     1
Italy                 1
Belgium         1
Australia        1


Top Ranked in Region

USA:                             Harvard
Canada:                       Toronto
Latin America:             Sao Paulo
Europe:                        Cambridge
Africa:                           Witwatersrand
Asia:                             Tokyo
Middle East:                Hebrew University of Jerusalem
Arab World:                 King Saud University

Noise Index
Average position change of universities in the top 20 in 2013:

0.9

Comparison

Shanghai Rankings (ARWU): 2011-12  --  0.15; 2012-13 --  0.25
THE WUR:  2012-13  --   1.2
QS  WUR    2012-13  --   1.7

Average position change of universities in the top 100 in 2013

10.59

Comparison

Shanghai Rankings (ARWU): 2011-12  --  2.01; 2012-13 --  1.66
THE WUR:  2012-13  --   5.36
QS  WUR    2012-13  --   3.97

The CWUR rankings, once we leave the top 20, are extremely volatile, even more than the THE and QS rankings. This, unfortunately, is enough to undermine their credibility. A pity since there are some promising ideas here.


Author: "Richard Holmes (noreply@blogger.com)"
Date: Wednesday, 16 Jul 2014 01:01

The demands on Western universities appear limitless.

Recently, the White House issued a plan to evaluate American universities according to the value that they offer. Proposals for indicators included the percentage of students receiving Pell grants, tuition fees, scholarships, student debt and graduate employment.

Not so long ago there were proposals that the US News and World Report's law school rankings should include the percentage of ethnic minority students.

Meanwhile, at the International Rankings Expert Group conference in London in June there were calls for universities to be ranked according to their contributions to environmental sustainability

The Universitas 21 rankings of higher education systems now include a 2% indicator for female academics and 2% for female students.

So universities are to be judged for admitting disadvantaged students, seeing that they, along with others, graduate, making sure that graduates get jobs, making sure that every classroom (well, maybe not in the humanities) is suitably diverse for race, gender and gender-orientation (although perhaps not, one suspects, for religion or politics) and contributing to environmental sustainability. No doubt there will be others: third mission, LGBT friendliness, community engagement, transformativeness? Doing research and providing instruction in disciplines and professions are no longer enough.

Now, The Upshot blog at the New York Times has gone further. American colleges and universities are apparently responsible for the limited literacy, numeracy and problem-solving skills of the entire population of the USA, whether or not they have been anywhere near a university.

The writer, Kevin Carey, claims that US primary and secondary schools are performing badly compared to their international counterparts. American students do badly on the Programme for International Student Assessment (PISA), tests run by the Organisation for Economic Cooperation and Development (OECD).

" (T)he standard negative view of American K-12 schools has been highly influenced by international comparisons. The Organization for Economic Cooperation and Development, for example, periodically administers an exam called PISA to 15-year-olds in 69 countries. While results vary somewhat depending on the subject and grade level, America never looks very good. The same is true of other international tests. In PISA’s math test, the United States battles it out for last place among developed countries, along with Hungary and Lithuania"
There are, as noted by blogger Steve Sailer, substantial variations by race/ethnicity in the 2012 PISA tests. The average score for the three tests -- mathematics, reading, science -- is 548 for Asian Americans, just below Hong Kong and just ahead of  South Korea, 518 for White Americans, the same as Switzerland, 465 for Hispanics, comfortably ahead of Chile and Costa Rica, and 434 for African Americans, well ahead of Brazil and Tunisia. The US education system is doing an excellent job of raising and keeping the skills of African Americans and Hispanics well above the levels of Africa, the Arab World and Latin America. Scores for Whites are comparable to Western Europe and those for Asian Americans to Greater China, except for Shanghai which is, in several respects, a special case.

What American schools have not done is to close the attainment gap between African Americans/Hispanics and Whites/Asians (probably meaning just Northeast Asians). Until the skills gap between Latin America and Africa on the one hand and Germany and Taiwan on the other is closed, this will not be a unique failing.

The post goes on to observe that Americans think that their higher education system is superior because there are 18 and 19 US universities respectively in the top 25 in the Times Higher Education (THE) and the Shanghai rankings. It does not mention that in the latest Quacquarelli Symonds (QS) rankings the number goes down to 15.

"International university rankings, moreover, have little to do with education. Instead, they focus on universities as research institutions, using metrics such as the number of Nobel Prize winners on staff and journal articles published. A university could stop enrolling undergraduates with no effect on its score.
We see K-12 schools and colleges differently because we’re looking at two different yardsticks: the academic performance of the whole population of students in one case, the research performance of a small number of institutions in the other." 

This is a little inaccurate. It is correct that the Shanghai rankings are entirely research based. The THE rankings, however, do have a cluster of indicators that purport to have something to do with teaching, although the connection with undergraduate teaching is tenuous: the teaching reputation survey is concerned with postgraduate supervision, and one indicator gives credit for the number of doctoral students relative to undergraduates. But the THE rankings, along with those published by QS, are not exclusively research orientated, and there is no evidence that American universities are uniquely deficient in undergraduate teaching.

If a university stopped admitting undergraduate students, its score on the THE and QS rankings would rise, since it would do better on the staff-student ratio indicator. Eventually, when all enrolled undergraduates had graduated, it would be removed from the rankings, since undergraduate teaching is a requirement for inclusion in these rankings.

Carey continues with a discussion of  the results of the PIAAC (Programme for the International Assessment of Adult Competencies) survey, conducted by the OECD.


"Only 18 percent of American adults with bachelor’s degrees score at the top two levels of numeracy, compared with the international average of 24 percent. Over one-third of American bachelor’s degree holders failed to reach Level 3 on the five-level Piaac scale, which means that they cannot perform math-related tasks that “require several steps and may involve the choice of problem-solving strategies.” Americans with associate’s and graduate degrees also lag behind their international peers.

American results on the literacy and technology tests were somewhat better, in the sense that they were only mediocre. American adults were eighth from the bottom in literacy, for instance. And recent college graduates look no better than older ones. Among people ages 16 to 29 with a bachelor’s degree or better, America ranks 16th out of 24 in numeracy. There is no reason to believe that American colleges are, on average, the best in the world."


It is true that only 18% of American bachelor degree holders reach numeracy level 4 or 5 on the PIAAC (Programme for the International Assessment of Adult Competencies) survey, compared to the OECD average of 24%. If, however, we look at those who reach levels 3*, 4 or 5, American bachelor degree holders do slightly better, with 74% compared to the international average of 70%.

Looking at all levels of education, it is noticeable that for numeracy, Americans with less than a high school education do badly compared to their OECD counterparts. Nine per cent have reached level 3, 4 or 5 compared with the OECD average of 24%. This 15 point difference increases to 20 points for high school graduates and falls to 17 points for associate degree holders. For bachelor degree holders, Americans are 4 points ahead of the average and 4 points behind for graduate and professional degree holders.

For literacy, American universities are average or slightly better than average. American associate and bachelor degree holders match the OECD percentages reaching level 4 or 5 -- 14% and 24% respectively -- and graduate and professional degree holders are slightly ahead -- 33% compared to 32%.

For problem solving in technology-rich environments, Americans lag behind at all levels but the gap between Americans and the OECD average gradually diminishes from 10 points for those with less than a high school education and high school graduates to 6 for those with an associates degree, 4 for those with a bachelor's degree and 3 for those with graduate or professional degrees.

It seems unfair  to blame American universities for the limitations of those who have never entered a university or even completed high school.

There is nothing in the PIAAC to suggest that American universities are currently performing worse than the rest of the OECD. They are however lagging behind Japan, Korea, and greater China and this is beginning to be confirmed by the international university rankings.

In any case, it is very debatable whether there is anything universities can do to override the remorseless effects of demography, social change, immigration and the levelling of primary and secondary education that are steadily eroding the cognitive abilities of the American population.

Even more striking, differences between the US and the rest of the developed world are relatively modest compared with those within the country.

Only 16% of White Americans are at level 4 or 5 for literacy, but that is much better than the 3% of Blacks and Hispanics. For numeracy the figures are 12%, 1% and 2%, and for problem solving 8%, 2% and 2%.

US universities are probably on average as good as the rest of the OECD, although it could be argued that the advantages of language and money ought to make them much better. But they cannot be held responsible for the general mental abilities of the whole population. That is much more dependent on demography, migration and social policy.

It is likely that as the assault on competition and selection spreads from primary and secondary schools into the tertiary sector, American universities will slowly decline especially in relation to Northeast Asia.




* Level 3: "Tasks at this level require the application of number sense and spatial sense; recognising and working with mathematical relationships, patterns, and proportions expressed in verbal or numerical form; and interpreting data and statistics in texts, tables and graphs."
Date: Tuesday, 15 Jul 2014 15:08
Thomson Reuters have published another document, The World's Most Influential Scientific Minds, which contains the most highly cited researchers for the period 2002-13. This one includes only the primary affiliation of the researchers, not the secondary ones. If the Shanghai ARWU rankings, due in August, use this list rather than the one published previously, they will save themselves a lot of embarrassment.

Over at arXiv, Lutz Bornmann and Johann Bauer have produced a ranking of the leading institutions according to the number of highly cited researchers listing them as primary affiliation. Here are their top ten universities, with government agencies and independent research centres omitted.

1.  University of California (all campuses)
2.  Harvard
3.  Stanford
4.  University of Texas (all campuses)
5.  University of Oxford
6.  Duke University
7.  MIT
8.  University of Michigan (all campuses)
9.  Northwestern University 
10. Princeton

Compared to the old list, used for the Highly Cited indicator in the first Shanghai rankings in 2003, Oxford and Northwestern are doing better and MIT and Princeton somewhat worse.

Bornmann and Bauer have also ranked universities according to the number of primary and secondary affiliations, counting each recorded affiliation as a fraction. The top ten are:

1.  University of California (all campuses)
2.  Harvard
3.  King Abdulaziz University, Jeddah, Saudi Arabia
4.  Stanford
5.  University of Texas 
6.  MIT
7.  Oxford
8.  University of Michigan
9.  University of Washington
10.  Duke

The paper concludes:

"To counteract attempts at manipulation, ARWU should only consider primary institutions of highly cited researchers."



The Noise
Date: Monday, 14 Jul 2014 20:50
An article in Research in Higher Education by Shari Gnolek, Vincenzo Falciano and Ralph Kuncl discusses what is required for a university in the mid-30s like Rochester to break into the top 20 of the US News & World Report's  (USNWR) America's Best Colleges. The answer, briefly and bluntly, is a lot more than Rochester is ever going to have.

Universities in Russia, India, Pakistan, Malaysia, Indonesia and other places often promise that one day they will be in  the top 100 or 50 or 10 of one of the international rankings. It would be interesting to see how much money they propose to spend.

An intriguing aspect of this paper is the concept of noise. The authors find that the USNWR rankings show a lot of volatility, with universities bouncing up and down for no particular reason, and that any change of four places or less can be regarded as a random fluctuation that should not give journalists or university administrators a heart attack or send them strutting around the campus.


One of the authors, a former vice provost at Johns Hopkins, told interviewers:

' “the trustees would go bananas” when Johns Hopkins dropped in the rankings. The administration would then have to explain what had happened.
“Every year Hopkins went from 15 to 16 to 15 to 16 – and I thought, ‘What a silly waste of energy,' ” Kuncl said in an interview Monday. (Johns Hopkins is currently No. 12.)
The paper found that small movements up or down in the rankings are more or less irrelevant. For most universities in the top 40, any movement of two spots or less should be considered noise, the paper said. For colleges outside the top 40, moves up or down of four spots should be thought of as noise, too.'

The amount of noise generated by a ranking is probably a good negative indicator of its reliability. We are, after all, dealing with institutions that receive millions of dollars in funds, produce thousands of papers and tens of thousands of citations, enroll thousands of students, graduate some of them, and employ hundreds of bureaucrats, faculty and adjuncts. We should not expect massive fluctuations from year to year.

I have calculated the average movement up or down the top 100 in Shanghai Jiao Tong University's Academic Ranking of World Universities (ARWU) between 2011 and 2012 and between 2012 and 2013, and in the Times Higher Education (THE) and Quacquarelli Symonds (QS) World University Rankings between 2012 and 2013. A university falling out of the top 100 altogether was counted as falling to 101st place.

In 2013 ARWU had a problem with Thomson Reuters, who were supposed to be preparing a new list of highly cited researchers, so they simply recycled the scores for that indicator from the 2012 rankings. This reduced the volatility of the rankings somewhat, so changes between 2011 and 2012 were also analysed.
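The volatility measure used here can be sketched in a few lines of code. This is illustrative only: the ranks below are made up, not the actual ARWU, THE or QS tables, and `average_movement` is a hypothetical helper name.

```python
# Average absolute change in position for universities in one year's top 100,
# with a university that drops out of the top 100 counted as falling to 101st.
def average_movement(ranks_y1, ranks_y2, cutoff=100):
    """ranks_y1, ranks_y2: dicts mapping university -> rank in each year."""
    moves = []
    for uni, r1 in ranks_y1.items():
        if r1 > cutoff:
            continue                      # only count year 1's top `cutoff`
        r2 = ranks_y2.get(uni, cutoff + 1)
        r2 = min(r2, cutoff + 1)          # falling out of the table = 101st place
        moves.append(abs(r2 - r1))
    return sum(moves) / len(moves)

# Made-up example: three universities, one of which drops out of the top 100.
y1 = {"A": 1, "B": 50, "C": 98}
y2 = {"A": 2, "B": 55, "C": 140}
print(average_movement(y1, y2))  # (1 + 5 + 3) / 3 = 3.0
```

The same function applied to two consecutive editions of any ranking gives the figures discussed below.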

Starting with the top 20 of ARWU, between 2012 and 2013 there was an average change of 0.25 places. Between 2011 and 2012 it was 0.15.

Between 2011 and 2012 the University of California San Francisco fell from 17th to 18th place, Johns Hopkins rose from 18th to 17th and Tokyo University fell from 20th to 21st. There were no other changes in the top twenty.


Moving on to the QS top 20 between 2012 and 2013, there were some significant changes, including Stanford rising from 15th to 7th and the University of Michigan falling from 17th to 22nd. The average change was 1.7 places.


At the top, the THE world rankings were somewhat less volatile than QS but much more so than ARWU. The average change was 1.2 places, and the biggest mover was University College London, which fell from 17th to 21st.

So university administrators should be concerned about any change within the top twenty of the ARWU but should not be bothered by a change of one or two places in the THE or QS rankings.

Moving on to the top 100, the average change in the ARWU was 1.66 places between 2012 and 2013 and 2.01 between 2011 and 2012. 

The biggest change between 2011 and 2012 was Goettingen, which fell from 86th to the 101-150 band.

In the QS rankings between 2012 and 2013, the average change in the top 100 was 3.97 places. Substantial changes include Boston University, which fell from 64th to 79th, and the University of Birmingham, which rose from 77th to 62nd.


In the THE top 100 the average change was 5.36 places. Notable changes include Lund University falling from 82nd to 123rd, Montreal falling from 84th to 106th and King's College London rising from 57th to 38th.


So, it can be concluded that the ARWU rankings are the most reliable. For the top 100, QS is more reliable than THE but the reverse is the case for the top 20.

How to explain these differences?  To be certain it would be necessary to look at the separate indicators in the three rankings but here are some thoughts.

The dominant indicator in the QS rankings is the academic survey, which has a weighting of 40%. Any fluctuation in the survey could have a disproportionate effect on the overall score. The most important single indicator in the THE rankings is Citations: Research Influence, which has a 30% weighting but contributes a higher proportion of total scores because the regional adjustment gives an extra boost to countries with a limited research base. In contrast, no indicator in the Shanghai rankings has more than a 20% weighting.
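The arithmetic behind this point is simple to sketch. With hypothetical indicator scores (none of these numbers come from any actual ranking), the same five-point swing moves the overall score twice as far when it hits a 40% indicator as when it hits a 20% one.

```python
# Overall score as a weighted sum of indicator scores (0-100 scale assumed).
def overall(scores, weights):
    return sum(scores[k] * weights[k] for k in weights)

# Hypothetical weighting scheme loosely modelled on QS: one dominant indicator.
weights = {"survey": 0.40, "citations": 0.20, "other": 0.40}
base    = {"survey": 70, "citations": 70, "other": 70}

# A five-point swing in the 40%-weighted survey...
survey_swing = dict(base, survey=75)
# ...versus the same swing in a 20%-weighted indicator.
citation_swing = dict(base, citations=75)

print(round(overall(survey_swing, weights) - overall(base, weights), 6))    # 2.0
print(round(overall(citation_swing, weights) - overall(base, weights), 6))  # 1.0
```

The heavier the weighting on a noisy indicator, the more of its noise ends up in the overall score and the final rank.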

The THE rankings include inputs such as income. An injection of research funds from a corporation would immediately improve a university's position in the income from industry, research income and total institutional income indicators. It would be a few years before the funds produced an improvement, if they ever did, in the publications indicator in ARWU, and even longer in the Citations per Faculty indicator in the QS rankings.

ARWU uses publicly available data that can be easily checked and is unlikely to fluctuate very much from year to year. THE and QS also use data submitted by institutions. There is room for error as data flows from branch campuses and research centres to the central administrators and then to the rankers. QS also has the option of replacing institutional data with that from third-party sources.

So everybody should relax when reading this year's rankings. Unless your university has risen or fallen by more than two spots in ARWU, four in the QS rankings or six in THE's.





Date: Tuesday, 01 Jul 2014 15:57
From University World News

"Tens of thousands of foreign students with invalid language test scores have been exposed in a groundbreaking investigation in Britain, while 57 private further education colleges have been stripped of their licences. Three universities have been prohibited from sponsoring new international students pending further investigations, and some 750 bogus colleges have been removed from the list of those entitled to bring foreign students to Britain."

The three universities are West London, Bedfordshire and Glyndwr. 

The proportion of students who are international accounts for 5% of the total weighting of the QS World University Rankings and 2.5% of Times Higher Education's.
Date: Monday, 30 Jun 2014 13:53
Times Higher Education Asian University Rankings

Source

Scope
Universities in Asia, including Southwest Asia, Turkey and Central Asia but not Australasia or the Pacific region.

Methodology
Unchanged since last year. Same as the THE World University Rankings.

Teaching: the Learning Environment 30%  (includes 5 indicators)
Research: Volume, Income, Reputation  30% (includes 3 indicators)
International Outlook 7.5% (includes 3 indicators)
Industry Income: Innovation 2.5%
Citations: Research Influence 30%

Top Ten

1.   University of Tokyo
2.   National University of Singapore
3.   University of Hong Kong
4.   Seoul National University
5.   Peking University
6.   Tsinghua University
7.   Kyoto University
8.   Korea Advanced Institute of Science and Technology
9=  Hong Kong University of Science and  Technology
9=  Pohang University of Science and Technology

Countries with Universities in the Top Hundred

Japan              20
China              18
Korea             14
Taiwan            13
India               10
Hong Kong      6
Turkey             5
Israel                3
Iran                  3
Saudi Arabia    3
Thailand           2
Singapore         2
Lebanon           1

Selected Changes

Hebrew University of Jerusalem down from 15th to 18th
Bogazici University, Turkey, up from 37th to 19th
Sogang University, Korea, down from 78th to 92nd
Panjab University, India,  from unranked to 32nd.
Keio University down from 53rd to 72nd
Date: Thursday, 26 Jun 2014 18:23
The number and frequency of international university rankings are constantly increasing, so I am starting to use a standard format.

QS BRICS Rankings

Source

Scope
Universities in Brazil, Russia, India, China, South Africa. Does not include Hong Kong, Macau or Taiwan.

Methodology
Unchanged since last year.

Academic Reputation 30%
Employer Reputation  20%
Faculty/Student Ratio 20%
Staff with a PhD 10%
Papers per Faculty 10%
Citations per Paper 5%
International Faculty 2.5%
International Students 2.5%

Top Ten

1.   Tsinghua University
2.   Peking University
3.   Lomonosov Moscow State University
4.   University of Science and Technology China
5.   Fudan University
6.   Nanjing University
7.   Universidade de Sao Paulo
8.   Shanghai Jiao Tong University
9=  Universidade Estadual de Campinas
9=  University of Cape Town

Countries with Universities in the Top Hundred

China               40
Brazil               19
Russia              18
India                15
South Africa     8

Selected Significant Changes

Harbin Institute of Technology down from 23rd to 27th
Wuhan University down from 26th to 33rd
Tomsk State University up from 58th to 47th
Manipal Academy of Higher Education up from 100th to 85th.
Date: Wednesday, 25 Jun 2014 13:54
This is from the Independent of 18 June, which is supposed to be a serious newspaper.

"British cities seeking to adapt to the realities of the new global economy should model their plans on the success of United States conurbations including Detroit, a former urban development advisor to President Obama has told The Independent. ...

But Bruce Katz, vice president of the Washington think tank the Brookings Institution, who has advised both the Clinton and Obama White Houses on urban regeneration, said that Detroit was now part of the metro revolution that is transforming the lives of millions of citizens and rebuilding the shattered US economy."


Five days later in the Independent.

"Activists angered by the closing of water accounts for thousands of people behind in their payments have taken their fight to the United Nations.

In March, the Detroit Water and Sewerage Department (DWSD) announced that it would start cutting off the services of homes, schools and businesses that were at least 60 days overdue or more than $150 behind.

It said it wanted to start recouping $118million owed from unpaid bills and that a fierce approach was needed to coax money from delinquent accounts, which make up almost half of the city’s total.

The move, in which as many as 3,000 properties were expected to be cut off each week, has outraged campaigners."


Date: Monday, 23 Jun 2014 20:16
The truth about Panjab University's (PU) rise in the Times Higher Education World University Rankings -- and no other -- is revealed in the Times of India.

Shimona Kanwar notes:

"The paper on the discovery of Higgs boson particle better known as the God particle, which earned the Nobel Prize in Physics last year, has come as blessing for Panjab University (PU). PU earned an overall score of 40.2, most of which has been contribution of citations from the university's publications. The paper on the God particle had 10,000 citations, which helped immensely give the numero uno status to PU in the country.
The Times Higher Education Asia University ranking-2014 had four parameters -teaching, international outlook, industry income, research and citations. Out of the 30% score on citations, 84.7 was the top score, which gave the university an edge over the other 21 participating universities. This included Jawaharlal Nehru University, Delhi University, Aligarh Muslim University and IIT Kharagpur among others. Though the CERN project which was associated with the discovery of the God particle involved participation from Delhi University as well, a huge number of PhD students in the project from PU apparently contributed in this rank."We had build parts of a detector, contributed for the hardware, software and physics analysis in the Compact Muon Solenoid (CMS) stage of the God particle discovery," said Prof Manjit Kaur of PU, who was part of the project.
Panjab University had 12-15 PhD students and five faculty members from the department of physics who worked in collaboration for the prestigious project."

A couple of things are missing, though. Delhi University (DU) also joined the project but did not even get into the top 100 of the Asian rankings. How come? It wasn't those doctoral students. It was probably (we can't be certain without seeing the scores for all the indicators) because, although PU had fewer citations than DU over the relevant period, it also had significantly fewer papers to divide them by.

The trick to getting on in the THE rankings is not just to get lots of citations in the right field and the right year and the right country but also to make sure the total number of papers doesn't get too high.
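The denominator effect described above can be shown with made-up numbers. To be clear, these are not the actual PU or DU citation or paper counts, and the indicator is reduced here to a bare citations-per-paper ratio, which is only a rough stand-in for THE's field- and year-normalized calculation.

```python
# Hypothetical figures only -- not the real Panjab/Delhi counts.
def citation_impact(citations, papers):
    """Citations per paper: a normalization that rewards a small output."""
    return citations / papers

pu = citation_impact(12_000, 1_500)   # fewer citations, but far fewer papers
du = citation_impact(20_000, 6_000)   # more citations, many more papers

print(pu)        # 8.0
print(round(du, 2))  # 3.33
assert pu > du   # the smaller denominator wins
```

A single massively cited paper, such as the Higgs boson discovery, therefore lifts a small producer much further than a large one.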

And, as I noted yesterday, if TR, THE's data collectors, do what they have done for the Highly Cited researchers database and stop counting physics publications with more than 30 affiliations, then PU will almost certainly fall out of the rankings altogether.

Date: Wednesday, 18 Jun 2014 09:01
Times Higher Education has a piece about the most highly educated cities in the world. First, of course, is London, followed by Paris, Los Angeles, San Francisco (presumably including Berkeley) and Stockholm. The data comes from a report by PricewaterhouseCoopers, the international financial services company, which includes information about the percentage of the population with degrees and the ranking of universities in the city by (surprise!) Times Higher Education.

Boston is not in the top ten because it was not evaluated by PricewaterhouseCoopers.

Note that the rankings indicator is based only on those universities that actually take part in the THE rankings. So London's score would not be affected by places like London Metropolitan University or the University of East London.

Looking at the PricewaterhouseCoopers report, the most important indicator might be the PISA scores, which suggest that the future belongs not to London or Paris but to Shanghai.
Date: Saturday, 07 Jun 2014 08:10
Harvey Mudd College is a very expensive highly ranked private school with a strong emphasis on the teaching of engineering and technology. The US News and World Report 2013 rankings have it in 12th place among national liberal arts colleges, second for master's engineering schools and fourth for computer engineering. Even so, it seems that some feel that it is a failure because it is not getting enough women to take courses in key disciplines such as computer science.

The new president, Maria Klawe, is taking care of that.

The new introductory class for Computer Science at Harvey Mudd is designed for those who did not go to computer camp in high school and is supposed to be interesting. Students edit Darth Vader's voice, and on one test the answer to every question is 42 (guess what the media would say if that happened in an underachieving inner city high school). If you are not amused by the joke about 42 you should forget about going to Harvey Mudd.

The course used to be about programming and was dominated by "geeky know it alls" who have now been told to mind their manners and shut up. Programming in Java has been replaced by Python.

"It was so much fun; it was so much fun" said one student.

Also, all female first-year students attend a conference on women in computing.

And so, at Harvey Mudd 40% of computer science majors are now women. Bridgette Eichelberger switched from engineering to computer science because the fun of engineering was nothing compared to the happiness of computer science.

Meanwhile over at Berkeley, the introductory computer science course is now called the "Beauty and Joy of Computing".

Someday universities in Shanghai, Seoul and Taipei may start turning their faculties of science and engineering into places where the daughters of the 1%, or maybe the 5%, can find fun and happiness and from which repellent geeks and nerds have been cleansed. Until that happens the universities and corporations of the US have cause to be very, very afraid.






Date: Thursday, 05 Jun 2014 19:56
And finally the results are out. The world's leading thinker, according to a poll conducted by Prospect magazine, is the economist Amartya Sen, followed by Raghuram Rajan, Governor of the Reserve Bank of India, and the novelist Arundhati Roy.

Sen received degrees from the University of Calcutta and Cambridge and has taught at Jadavpur University, LSE, Oxford, Cambridge and Harvard. Rajan has degrees from IIT Delhi, the Indian Institute of Management Ahmedabad and MIT. Roy studied architecture at the School of Architecture and Planning in Delhi.

The careers of Sen and Rajan illustrate a typical feature of Indian higher education: some excellent undergraduate teaching, but somehow the outstanding students end up leaving India.

Prospect notes that the poll received "intense media interest in India" so it would be premature to conclude that the country has become the new global Athens.

The top non-Asian thinker is Pope Francis.

Personally, I am disappointed that the historian Perry Anderson only got 28th place.  I am also surprised that feminist and queer theorist Judith Butler, whose brilliant satire -- Hamas as part of the global left and so on -- is under-appreciated, was only 21st.
Date: Tuesday, 03 Jun 2014 11:55
Sunday's New York Times had two articles on international rankings. The first, by D. D. Guttenplan, is 'Re-Evaluating the College Rankings Game' and includes interviews with Angela Yung Chi Hou of the Higher Education Evaluation and Accreditation Council of Taiwan, Ellen Hazelkorn and myself.

The second by Aisha Labi is about the recently published U-Multirank rankings which are sponsored by the European Union.