The most prestigious peer-reviewed journals in the world, such as Cell, Nature, Science, and the Journal of the American Medical Association (JAMA), have less and less influence amongst scientists, according to a paper co-authored by Vincent Larivière, a professor at the University of Montreal's School of Library and Information Sciences. He questions the relationship between a journal's "impact factor" and the number of citations subsequently received by its papers. "In 1990, 45% of the top 5% most cited articles were published in the top 5% highest impact factor journals. In 2009, this rate was only 36%," Larivière said. "This means that the most cited articles are published less exclusively in high impact factor journals." The proportion of these articles published in major scholarly journals has sharply declined over the last twenty years. His study was based on a sample of more than 820 million citations and 25 million articles published between 1902 and 2009. The findings were published in the Journal of the American Society for Information Science and Technology.
For each year analysed in the study, Larivière evaluated the strength of the relationship between the citations articles received in the two years following publication and the impact factor of the journals that published them. He then compared the proportion of the most cited articles that appeared in the highest impact factor journals. "Using various measures, the goal was to see whether the 'predictive' power of impact factor on citations received by articles has changed over the years," Larivière said. "From 1902 to 1990, major findings were reported in the most prominent journals," Larivière noted. "But this relationship is less true today."
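For readers who want to see the arithmetic behind this kind of year-by-year comparison, here is a minimal sketch. It is not the authors' code: the data are randomly generated toy values, and the function and variable names are invented for illustration.

```python
# Minimal sketch of the kind of year-by-year comparison described above.
# Not the authors' code: toy data, illustrative names only.
import numpy as np

def top_share(citations, impact_factors, quantile=0.95):
    """Fraction of the most cited articles (top 5% by citations) that were
    published in the highest impact factor journals (top 5% by IF)."""
    top_cited = citations >= np.quantile(citations, quantile)
    in_top_journals = impact_factors >= np.quantile(impact_factors, quantile)
    return np.mean(in_top_journals[top_cited])

# Toy data: one row per article -- the impact factor of its journal and the
# citations it received in the two years after publication.
rng = np.random.default_rng(0)
impact_factors = rng.gamma(shape=2.0, scale=1.5, size=10_000)
citations = rng.poisson(lam=impact_factors)  # noisy link between IF and citations

# Two of the yearly measures described in the article:
r = np.corrcoef(impact_factors, citations)[0, 1]
print(f"correlation(IF, 2-year citations) = {r:.2f}")
print(f"share of top-cited articles in top-IF journals = {top_share(citations, impact_factors):.2f}")
```

Tracking how both numbers change from year to year is, in spirit, what the study does on 25 million real articles.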
Larivière and his colleagues George Lozano and Yves Gingras of UQAM's Observatoire des sciences et des technologies also found that the decline in the predictive power of high impact factor journals began in the early 1990s, when the Internet spread rapidly through the scientific community. "Digital technology has changed the way researchers are informed about scientific texts. Historically, we all subscribed to paper journals. Periodicals were the main source for articles, and we didn't have to look outside the major journals," Larivière noted. "Since the advent of Google Scholar, for example, the process of searching for information has completely changed. Search engines provide access to all articles, whether or not they are published in prestigious journals."
Impact factor as a measure of a journal's influence was developed in the 1960s by Eugene Garfield, one of the founders of bibliometrics. "It is basically the average number of times a journal's articles are cited over a two-year period," Larivière explained. "Initially, this indicator was used to help libraries decide which journals to subscribe to. But over time, it began to be used to evaluate researchers and determine the value of their publications." The importance of impact factor is so ingrained in academia's collective consciousness that researchers themselves use impact factor to decide which journals they will submit their articles to.
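To make that definition concrete: a journal's impact factor for a given year is the citations received that year by the items it published in the previous two years, divided by the number of citable items published in those two years. A minimal sketch with invented figures:

```python
# Hedged illustration of the standard two-year impact factor definition
# quoted above; the numbers are invented purely for the example.
def two_year_impact_factor(citations_this_year, citable_items_prev_two_years):
    """Citations received this year to the previous two years' content,
    divided by the number of citable items published in those two years."""
    return citations_this_year / citable_items_prev_two_years

# e.g. 2,400 citations in 2012 to a journal's 2010-2011 content,
# which comprised 800 citable items -> an impact factor of 3.0
print(two_year_impact_factor(2400, 800))
```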
Various experts in bibliometrics have criticized the use of impact factor as a measure of an academic journal's visibility. A common criticism is that the indicator contains a calculation error. "Citations from all types of documents published by a journal are counted," Larivière said, "but they are divided only by the number of articles and research notes. Impact factor is thus overestimated for journals that publish a good deal of editorials, letters to the editor, and science news, such as Science and Nature."
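A small, purely hypothetical illustration of that asymmetry (all figures below are invented):

```python
# Hypothetical illustration of the numerator/denominator asymmetry
# described above; every figure is invented.
citations_to_articles = 2000      # citations to articles and research notes
citations_to_front_matter = 600   # citations to editorials, letters, news items
articles_and_notes = 800          # only these count in the denominator

# The numerator includes every document type; the denominator does not.
reported_if = (citations_to_articles + citations_to_front_matter) / articles_and_notes
articles_only_if = citations_to_articles / articles_and_notes
print(reported_if, articles_only_if)  # 3.25 vs 2.5 -> the reported figure is inflated
```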
Another criticism is that the time frame in which citations are counted in calculating impact factor is too short. "There are research areas in which knowledge dissemination is faster than it is in others," Larivière said. "We cannot, for example, expect to get the same kind of impact factor in engineering and in the biomedical sciences." Yet journal impact factor is calculated over the same two-year window following publication, regardless of the discipline.
The research results reveal some interesting points. On the one hand, journals are increasingly poor predictors of the number of citations an article can expect to receive. "Not only has the predictive power of impact factor declined, but also, impact factor is no longer suitable for evaluating research," Larivière argued. In his opinion, if we want to evaluate researchers and their work, it is best to use citations, which are a true measure of an article's impact. "This indicator is more accurate. It is not an estimation based on the hierarchy of journals." On the other hand, his work confirms that the dynamics of scholarly journals are changing, due especially to the open access to knowledge made possible by the Internet. "What then is the present function of scholarly journals?" Larivière asked. "One remains: peer review."
###
University of Montreal: http://bit.ly/mNqklw
Thanks to University of Montreal for this article.
Source: http://www.labspaces.net/125108/Study_reveals_declining_influence_of_high_impact_factor_journals