Showing posts with label SCImago. Show all posts

12 August 2008

Citations and performance evaluation

Gerry McKiernan has drawn attention on the BOAI discussion list to a special issue of Ethics in Science and Environmental Politics, devoted to 'The use and misuse of bibliometric indices in evaluating scholarly performance'. All of the papers are open access. The authors include Philip Campbell, Editor-in-Chief of Nature, on escaping from the Impact Factor, who concludes: 'Although the current system may be effective at measuring merit on national and institutional scales, the most effective and fair analysis of a person's contribution derives from a direct assessment of individual papers, regardless of where they were published'; Peter Lawrence on how measurement harms science; Anne-Wil K. Harzing and Ron van der Wal on Google Scholar as a new source for citation analysis; and Stevan Harnad on validating research performance metrics against peer rankings. There are fourteen papers in all, including the introduction to the issue, and all are worth at least dipping into.

The Harzing and van der Wal paper is likely to be somewhat contentious, since other research into the use of Google Scholar has downplayed its value. I first raised the question of using Google in performance measurement in a message to the JESSE mailing list in 2002. I commented that:
My most cited paper is "On user studies and information needs" (1981) - a Web search (using Google) revealed 118 pages that listed the title. The pages were reading lists, free electronic journals, and documents that would never be covered by SSCI, such as reports from various agencies. SSCI revealed, if I recall aright, 79 citations of the paper. The question is: is the Web revealing impact more effectively than SSCI? Citation in scholarly papers takes a variety of forms and much citation is of a token variety - x is cited because x is always cited. On the other hand citation on reading lists implies some positive recommendation of the text, and mention in policy documents and the like, implies (at least in some cases) that some benefit has been found in the cited document.

This led to an extended and interesting discussion. It was followed by a paper on the subject in JASIST by Vaughan and Shaw (volume 54, issue 14, pp. 1313-1322) and by Péter Jacsó's comparison of Google Scholar and Web of Science in Current Science (v. 89, no. 9, pp. 1537-1547) - incidentally, Harzing and van der Wal reply to some of Jacsó's criticisms of Scholar.

All of this suggests that interest in the application of 'metrics' in performance measurement is growing and, perhaps, that there is also a growing awareness that the impact factor provided by Web of Knowledge is not necessarily the only tool that can be used. We have also seen the emergence of the SCImago analysis of impact, which I have referred to earlier. These developments, as well as providing for a new academic industry of papers on Web citation, are likely to bring about a re-assessment of metrics for performance measurement and, perhaps, greater wariness about their use than is currently shown by some administrators.

19 January 2008

SCImago journal ranking - again

Wouter Gerritsma has picked up on the existence of the SCImago journal ranking system (based on Scopus) and has referenced a couple of earlier posts on this Weblog. He compares the SJR with the Web of Knowledge JIF and finds that they are quite closely correlated. I wonder whether this is not altogether to be expected: the JIF is based on citations to papers in the journal, while the SJR is based on something similar to Google's PageRank algorithm, and PageRank is based on links to sites from other sites, which is, in itself, a form of citing. Wouldn't we expect the two measures to be closely aligned?
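For readers unfamiliar with the idea, a toy sketch in Python may help. This is not the actual SJR computation (SCImago's formula has its own weighting and normalisation details), only the underlying PageRank principle: a citation from a highly-ranked source counts for more than one from an obscure source. The journal names and the citation graph here are invented for illustration.

```python
def pagerank(citations, damping=0.85, iterations=50):
    """A PageRank-style power iteration over a citation graph.

    citations: dict mapping each journal to the list of journals it cites.
    Returns a dict of rank scores that sum to 1.
    """
    journals = list(citations)
    n = len(journals)
    rank = {j: 1.0 / n for j in journals}
    for _ in range(iterations):
        # every journal keeps a small 'teleport' share of rank
        new_rank = {j: (1.0 - damping) / n for j in journals}
        for citing, cited_list in citations.items():
            if cited_list:
                # the citing journal spreads its rank over the journals it cites
                share = damping * rank[citing] / len(cited_list)
                for cited in cited_list:
                    new_rank[cited] += share
        rank = new_rank
    return rank

# Hypothetical graph: A and C both cite B; B cites A; C receives no citations.
graph = {"A": ["B"], "B": ["A"], "C": ["B"]}
ranks = pagerank(graph)
# B, cited by two journals (one of them well-ranked), ends up highest;
# C, cited by nobody, ends up lowest.
```

The point of the damping factor is simply that rank is not determined by raw citation counts alone: B outranks A here partly because one of its citations comes from the well-ranked A.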

23 December 2007

The "SCImago Influence Measure"

I mentioned the new SCImago journal ranking site a little while back and thought I would explore it a little further. In doing so, I find that the "Cites per doc" measure, which is given for one-, two-, three- and four-year periods, might be called the 'SCImago Influence Measure' or 'SIM', since it is more or less equivalent to the Web of Knowledge Impact Factor. I prefer 'influence' to 'impact', since the latter is rather macho and percussive, while the former is much more subtle and, I think, more appropriate: what we are talking about is the influence that a journal has within its field.

The four-year SIM is particularly interesting, I think, since it allows a much longer period within which the documents have a chance of being cited. Using the SCImago database to download the data also gives the opportunity to produce some interesting comparisons. The graph below shows the four-year SIM for a long-established journal, the Journal of Documentation, compared with three now-established open access journals: Information Research, the Journal of Digital Information and the Journal of Electronic Publishing. It is striking that, on this measure, all three OA journals are now approaching the same level of 'influence' as the older journal. JEP has had some problems in maintaining publication, hence the dip in 2006, but with its future now established (I believe), I imagine that the growth in its influence will resume.

11 December 2007

SCImago Journal and Country Rank

News of a new journal ranking site from the SCImago research group at the University of Granada. Described as follows:
The SCImago Journal & Country Rank is a portal that includes the journals and country scientific indicators developed from the information contained in the Scopus® database (Elsevier B.V.). These indicators could be used to assess and analyze scientific domains.

This platform takes its name from the SCImago Journal Rank (SJR) indicator, developed by SCImago from the widely known algorithm Google PageRank™. This indicator shows the visibility of the journals contained in the Scopus® database from 1996.

A natural question for me, then, is: How does Information Research show up in this new ranking? So, I took the journals that are similar to Information Research, in that they are not 'niche' journals, but publish widely across information science, information management, librarianship, etc., from ISI's Journal Citation Reports and then gathered the data from SCImago. To reduce the effort of creating a table (not as easy in Blogger as it is in Free-Conversant) I have taken the top 10 journals from the list:

Journal               h-index    SJR   cites/doc     JIF
Info & Mgt               29    0.069      3.65     2.119
Journal of ASIST         27    0.068      2.48     1.555
Info Pro & Mgt           27    0.058      2.11     1.546
J of Doc                 23    0.058      1.61     1.439
Info Research            12    0.053      1.77     0.870
Lib & Info Sci Res       14    0.053      1.26     1.059
Int J Info Mgt           18    0.051      1.55     0.754
Lib Qly                  14    0.051      1.23     0.528
J Info Sci               17    0.051      1.01     0.852
Lib Trends               14    0.050      0.85     0.545


The h-index is well known in the bibliometrics fraternity and is normally used to measure the productivity and impact of an individual scholar. One of its problems, particularly significant in ranking journals, is that the longer the period over which a scholar (or journal) has been active, the more likely it is to accumulate a high h-index, so its usefulness here may be limited. Even so, it is interesting to see that Information Research, a comparatively young journal, already has an h-index of 12, not far below some much older journals in the list.
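For the record, the h-index of a set of papers is the largest h such that h of the papers have received at least h citations each. A small illustrative function (the citation counts below are invented):

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i  # the i-th most-cited paper still has >= i citations
        else:
            break
    return h

# Hypothetical citation counts for a journal's papers:
print(h_index([25, 8, 5, 3, 3, 1]))  # -> 3: three papers with >= 3 citations each
```

Note how the single highly-cited paper (25 citations) contributes no more to the h-index than a paper with exactly 3 citations would; this insensitivity to outliers is part of the measure's appeal, and part of its limitation.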

The SJR measure is explained as:
...an indicator that expresses the number of connections that a journal receives through the citation of its documents divided between the total of documents published in the year selected by the publication, weighted according to the amount of incoming and outgoing connections of the sources.

The 'cites/doc' measure is based on the number of citations received in the previous four years and the total number of documents published in 2006.

JIF is the ISI Journal Impact Factor.
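As the description above suggests, cites/doc amounts to a simple ratio of citations received to documents published, much like an impact factor with a longer window. A trivial sketch, with invented figures:

```python
def cites_per_doc(citations_received, documents_published):
    """Simple citations-to-documents ratio, akin to an impact factor.

    citations_received: citations in the chosen window (here, four years)
    documents_published: documents counted for the target year
    """
    if documents_published == 0:
        return 0.0  # avoid division by zero for journals with no output
    return citations_received / documents_published

# Hypothetical journal: 150 citations over four years to 100 documents.
print(cites_per_doc(150, 100))  # -> 1.5
```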