A spin-off from the e-journal, dedicated to informal publication of ideas and comment on current affairs in the information world — and occasional personal posts.
29 April 2011
It's been a little while since I updated my page of links to material on Activity Theory, but now it is done, and those interested will find it online some time after 22:30 GMT tonight. Very little has changed: two of the links returned error messages, but the material was still there, with new URLs, and there is one additional link. I also checked the page of links to material on Vygotsky - that is still there, but the last two items in the list return 404 errors. If you come across other pages that you think may be of interest, please send me a link.
03 April 2011
Impact factors and true 'influence'
I've never been a particular fan of bibliometrics, although many years ago (40 to be exact!) I taught a course in the subject. There are uses for the methods, such as determining which journals one might subscribe to when setting up a new library service - although making the rounds of the clients and seeking their advice probably serves as well - but, generally, it seems that many bibliometric studies are carried out simply as an exercise in the methods, or to refine them, without much being said about their practical applications.
I was pleased, therefore, to come across a paper that appears to have something useful to say about 'impact factors' - drawing attention to some of the problems and proposing an alternative method of determining 'impact' or, as I would prefer, 'influence'. 'Impact' is one of those macho, aggressive words, chosen, it seems, to impress, whereas what one is really talking about is the influence of a journal within a scholarly field.
The paper is 'Integrated Impact Indicators (I3) compared with Impact Factors (IFs): an alternative research design with policy implications', by Loet Leydesdorff and Lutz Bornmann. The authors argue that the journal impact factor is flawed as a consequence of being based on the two-year average of citations, whereas a 'true' indicator of 'impact' would be based on the sum of the citations. This is argued on the analogy that the impact force of two bodies is the sum of their mass times the velocity of impact (if I have understood things aright, not being a statistician!). On this basis, the total number of citations (the analogue of 'velocity') needs to be taken into account.
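The point about averaging versus summing is perhaps easiest to see with a small sketch. The Python fragment below is my own illustration, not the authors' actual I3 calculation, and the citation counts are entirely invented: it simply shows how a small journal with a high citation rate can lead on an average-based measure (as the impact factor is), while a larger journal leads once the citations are summed.

```python
# Illustration only: why an average-based indicator and a sum-based indicator
# can rank two journals differently. All citation counts are invented.

journal_a = [12, 9, 15, 11, 8]                  # small journal, high citation rate
journal_b = [3, 5, 2, 6, 4, 3, 5, 2, 4, 3,
             6, 2, 5, 3, 4, 2, 6, 3, 4, 5]      # larger journal, lower rate

def average_citations(cites):
    """Impact-factor style measure: citations per citable item."""
    return sum(cites) / len(cites)

def summed_citations(cites):
    """Sum-based measure: total citations, so the size of the journal counts."""
    return sum(cites)

for name, cites in (("Journal A", journal_a), ("Journal B", journal_b)):
    print(f"{name}: average = {average_citations(cites):.2f}, "
          f"summed = {summed_citations(cites)}")

# Journal A: average = 11.00, summed = 55
# Journal B: average = 3.85, summed = 77
# The ranking flips: A leads on the average, B leads on the summed total.
```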
The authors use the LIS category of Web of Knowledge, showing that, on the basis of 'summed impact', JASIST is ranked ahead of MIS Quarterly, rather than behind it. However, this gives rise to another problem: how to categorise the journals in the first place. MIS Quarterly's primary classification in Web of Knowledge is in the information systems category and it is something of a mystery why it appears in the LIS classification at all. Inevitably, then, a core journal like JASIST must appear ahead of one that is misplaced in the classification scheme. This is not to dispute the argument of Leydesdorff and Bornmann; I simply raise the issue.
The LIS category in Web of Knowledge is a complete mess, with some journals having a secondary home there and others placed there, seemingly, because there was nowhere else to put them, such as The Scientist, which, in any event, is a kind of news magazine rather than a scholarly journal. Other examples of journals in the LIS list that have their primary location somewhere else include International Journal of Computer-Supported Collaborative Learning, Information Systems Research, Journal of Computer-Mediated Communication, Journal of Information Technology, and too many more to list. Anyone wishing to compare rankings (arrived at by any means) would need to clean up the list on the basis of which journals LIS scholars are likely to seek publication in - I suspect that, instead of there being 66 journals to consider, one would probably have something like half that number. Urging LIS researchers to publish in MIS Quarterly or Information Systems Research is a completely pointless exercise, since their papers are likely to be ruled 'out of scope'. This is of importance when it comes to journal ranking, since, using the 5-year impact factor, I would argue that seven of the top 20 journals are 'out of scope'.
Another point that might be addressed is that of the general versus the specific. We might expect that a specialist journal, catering for a well-developed area of research, will have a higher 'impact' than a more general purpose journal. Thus, of the top 20 journals ranked by 5-year impact factor, six are what I would call 'niche' journals, such as the International Journal of Geographical Information Science and Scientometrics. If, as is increasingly the case, researchers are not simply urged, but required, to publish in the top-ranked journals, this leaves (excluding those that are out of scope) not 20, but seven 'general purpose' journals in which to seek publication - or, since ARIST is no longer published, six. That is: Information Management; JASIST; Information Processing and Management; Journal of Information Science; Journal of Documentation; and Library and Information Science Research.
If we want to increase that to the 'top ten' general purpose LIS journals, we would add: International Journal of Information Management; Information Research; Library Quarterly; and Journal of Library and Information Science - which takes us down to number 34 in the WoK rankings.
All of which serves to demonstrate that lists and rankings are treacherous things that are probably best avoided. :-)