12-31-2012 05:00 PM - edited 01-24-2013 02:23 PM
economics & business
pharmacology & toxicology
psychiatry & psychology
molecular biology & genetics
neuroscience & behaviour
social sciences, general
plant & animal science
biology & biochemistry
environment & ecology
02-11-2013 07:09 AM - edited 02-11-2013 07:11 AM
There are methodological differences between the processes used to produce these lists (www.highlycited.com/methodology/) and those used to produce the highly cited researcher lists that appeared at isihighlycited.com. As a result, some researchers appear on one list and not the other. Thomson Reuters is committed to developing and refining our process for identifying leading research and researchers. While we no longer follow the processes used on ISIHighlyCited.com to identify highly cited authors, we consider individuals appearing on either list to be highly cited researchers.
Our Highly Cited Research team has provided additional clarification of the new methodology here:
For your reference here is a link to the original methodology:
To confirm: Thomson Reuters considers individuals from both lists highly cited. Once someone has been selected as highly cited, they remain highly cited; changing the selection criteria going forward does not strip a researcher of a highly cited designation conferred at any point in the past.
If you have any further questions, please submit a request using our Technical Support Webform below:
02-26-2013 12:03 PM
Lost in Citation

K.S. Novoselov and A.K. Geim, from the University of Manchester, arguably among the most influential physicists alive, coauthored the two most cited papers worldwide in Physics in the period 2001-2010. Moreover, the seventh most cited paper in the same period was also theirs. The Nobel Prize was awarded to these two researchers in 2010, yet, apparently, they deserve a place neither on the old list of Highly Cited Researchers (HCR) up to 2012 nor on the new HCR list released by Thomson Reuters in December 2012.

Combine this fact with the new list makers' policy of not distinguishing between first and/or corresponding authors and the other contributors, which introduces a noticeable bias towards large collaborating teams (particularly severe in disciplines where huge international consortia appear as the authors of scientific papers), and the effects are shocking. To give an idea of the potentially lethal effect on Thomson Reuters' credibility when nominating citation celebrities, let us compare the performance of the two authors from Manchester with that of the STAR Collaboration, a powerful consortium in Particle Physics. A quick look at the Web of Knowledge shows that K.S. Novoselov and A.K. Geim jointly authored, in the decade 2001-2010, a total of seven Physics papers with more citations than any paper written by the STAR Collaboration. In the same period they wrote another nine Physics papers with more citations than all but one of the STAR Collaboration's papers. Astonishingly, 145 members of the STAR Collaboration (whose papers show a great deal of self-citation, given the large number of authors) are on the list of highly cited authors in Physics (more than 40% of the authors on the Physics list), but the two Nobel Prize winners are not.
"Thomson Reuters recognizes the potential dangers associated with massively multi-authored papers, but leave for future HCR nominations the evaluation of the methodology in relation with group authorship papers, which are currently included in HCR analysis".
When looking for the rationale behind such a blunder by Thomson Reuters, one factor stands out as the most likely culprit: K.S. Novoselov and A.K. Geim regularly publish their findings in Nature, the Proceedings of the National Academy of Sciences, and Science, among other high-impact journals. The Physics papers (Physics is what those two researchers do, as far as the Nobel Prize Committee is concerned) that appear in any of the journals mentioned above are never classified within the Physics Research Area (SU in the Thomson Reuters search engine): they fall in the area of Science and Technology: Other Topics. Thomson Reuters acknowledges this drawback, but associates it with the difficulties that arise when trying to classify research that crosses disciplinary boundaries: "... it is possible that an individual might produce a number of papers that are individually highly-cited but, because they are in journals assigned to different broad areas (such as Chemistry and Biochemistry, or Environment and Engineering) they do not appear as a substantive cluster within any one ESI category. We need to overcome this division of data if it would cause us to miss individual Highly-Cited Researchers".
However, we are talking here about researchers who publish papers well within the boundaries of a single discipline. Their only mistake was to publish those results in high-impact journals that Thomson Reuters classifies under the broadest multidisciplinary journal category: Science and Technology: Other Topics.
The same happens to papers in other disciplines: if they are published in those journals, they will be lost in citation.
Domingo Docampo, February 26, 2013
03-21-2013 04:21 PM
Regarding the December 2012 lists, the problems that Domingo Docampo identifies in the physics domain apply just as strongly to the ecology/environment domain, I suspect for the same or similar reasons. Many of the world's most influential ecologists, and most of those rated among the world's 20 most cited ecologists by other algorithms, are apparently not among the world's 239 most cited ecologists according to this list. I would encourage Thomson Reuters to reconsider and reevaluate the criteria and methodology used in constructing this list, because, based on the names that are conspicuously absent from it, it will not be seen as credible by anyone who knows much about who is doing what in the field of ecology.
03-28-2013 11:57 AM - edited 03-28-2013 12:03 PM
Dear Dddocampo and Daw,
We value our customers' feedback.
Could you please submit a technical support ticket so that we can respond to your feedback? You can send your contact information and a description of your feedback by opening a technical support eticket here:
03-28-2013 02:38 PM
In 2012, Thomson Reuters consulted with the research community on a new methodology for identification of Highly Cited Researchers. The results of that consultation and methodology were published in December 2012 and January 2013 [link]. The project team has received a lot of feedback from the scholarly community about our results. Researchers, bibliometricians, and administrators pointed out that these lists omitted too many prominent scientists to represent a robust listing of leading researchers. We agree.
In our attempt to normalize actual citation counts to papers at the specialty level (individual Web of Science subject categories), and then to compare these records and identify researchers with multiple highly cited papers at an aggregated level, we arrived at an over-representation of publications and researchers that were highly cited only in relative terms. By extracting the top 1% of papers at the aggregated level according to normalized citation scores, publications with lower relative scores but considerably higher absolute citation counts were pushed out of the collection, and so were their authors.
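The effect described above can be sketched with a toy example (the numbers and paper labels are hypothetical, not Thomson Reuters data): when citations are divided by the field mean and a single cut is then taken across all fields, a paper with the highest absolute citation count in the whole set can fall below the threshold simply because its field's baseline is high.

```python
# Hypothetical toy data: two fields with very different citation baselines.
papers = [
    ("phys_1", "physics", 9000),  # the most-cited paper in the whole set
    ("phys_2", "physics", 8500),
    ("phys_3", "physics", 8000),
    ("phys_4", "physics", 7500),
    ("eco_1",  "ecology",  300),
    ("eco_2",  "ecology",  250),
    ("eco_3",  "ecology",   40),
    ("eco_4",  "ecology",   10),
]

# Step 1: field-level normalization (citations divided by the field mean).
by_field = {}
for _, field, cites in papers:
    by_field.setdefault(field, []).append(cites)
field_mean = {f: sum(c) / len(c) for f, c in by_field.items()}

scored = [(pid, cites / field_mean[field]) for pid, field, cites in papers]

# Step 2: one aggregated cut across all fields by normalized score
# (the top 2 of these 8 papers stands in for the "top 1%").
scored.sort(key=lambda t: t[1], reverse=True)
top = [pid for pid, _ in scored[:2]]

print(top)  # ['eco_1', 'eco_2']
# phys_1 has by far the highest absolute count (9000 citations),
# yet it misses the cut because the physics baseline is so high.
print("phys_1" in top)  # False
```

This is only an illustration of the selection mechanics; the actual procedure, subject categories, and thresholds used by Thomson Reuters are more involved.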
We reran our analysis using a revised procedure and obtained names that we recognized – and that the community will recognize – as leading researchers. Moreover, we compared two approaches: normalization at the Web of Science subject category level and at the level of Essential Science Indicators (ESI) disciplines. We found substantially the same list of names using either approach.
We have decided to employ the definition of highly cited papers used in ESI to create our new highly cited author listing, for several reasons: the old lists of highly cited researchers (based on total citations rather than number of highly cited papers) were organized by ESI disciplines; the methodology for defining highly cited papers is well described in ESI, familiar to many, and therefore transparent; the two methods yielded largely the same list of researchers; and you, the scholarly community, consistently reference ESI in your responses to the new methodology and its results. The data analyzed will be papers (articles and reviews) recorded by Thomson Reuters from 2002 through 2012 and citations to these papers over the same period. The data surveyed will therefore be the same as those currently appearing in ESI as of March-April 2013. On May 1st, the regular bimonthly update of ESI will shift the time span of coverage to 2003 through February 2013, and the data will change as they do with each bimonthly update.
We are replacing the old preliminary data with new preliminary data in each ESI discipline, using the method described above. More work is required before these lists are made final since author names must be validated and checked for variants (surname and one initial vs. surname and two initials, which represent the same researcher), for homographs (researchers who happen to have the same name form whose work is mistakenly combined in summary counts), and for the most recent affiliation of the individuals identified. This process will be ongoing and our goal is to update our information as we receive feedback from the community.
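The variant check described above can be sketched as follows. This is a hypothetical illustration, not Thomson Reuters' actual matching logic: two name records are flagged as possible variants of one researcher when the surnames match and one set of initials is a prefix of the other (surname and one initial vs. surname and two initials). Note that such a check can only propose candidates; homographs (different researchers sharing a name form) pass it too, which is exactly why manual validation is needed.

```python
def possible_variants(name_a: str, name_b: str) -> bool:
    """Flag two 'Surname, Initials' records (e.g. 'Geim, A' vs 'Geim, AK')
    as candidate variants of the same researcher. Candidates still require
    manual validation, since homographs also pass this check."""
    sur_a, ini_a = (p.strip() for p in name_a.split(","))
    sur_b, ini_b = (p.strip() for p in name_b.split(","))
    if sur_a.lower() != sur_b.lower():
        return False
    # One-initial form matches a two-initial form if it is a prefix of it.
    shorter, longer = sorted((ini_a.upper(), ini_b.upper()), key=len)
    return longer.startswith(shorter)

print(possible_variants("Geim, A", "Geim, AK"))   # True  (candidate variant)
print(possible_variants("Geim, AK", "Geim, AN"))  # False (initials conflict)
print(possible_variants("Geim, A", "Novoselov, K"))  # False (different surname)
```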
Thomson Reuters is grateful to members of the research community for their input and advice. Our project team is confident that our new results will be credible and useful but we wish to extend our invitation for comments on our data. No procedure is wholly free from error so we stand ready to address concerns and questions from our partners in the research community.
4 weeks ago
Is it possible to confirm the status of the 2001-2010 Highly Cited Researchers lists? I know that there is discussion about the methodology, but will that apply only to the next round, or are the 2001-2010 lists being updated as well?