A question that always sparks discussion is whether articles published open access are cited more often than articles published in toll access journals.
It is also a question that has been written about quite a bit. In the previous blog post you could read that JSTOR records over 150 million attempts yearly to download articles that the users do not have access to. Some studies indicate that open access articles receive more citations soon after publication, while others indicate that there is no great difference in citation counts between open access and toll access articles. Such studies have been difficult to carry out because it takes years for a journal to build up its reputation. An article from 2006 shows that open access articles are cited more quickly after publication: a certain period after publication (approx. 200 days), 49 % of toll access articles had not yet received any citations, compared with 37 % of open access articles. The study also showed that open access articles were used more even when researchers had access to the toll access journals through a library, and that open access articles published in the same journals as non-open access articles received more citations.
Yet another article was published in 2008. It showed that open access articles had received on average nine citations, while toll access articles had received five. This study also shows that there are differences between disciplines: sociology, for example, had the biggest citation advantage but the fewest open access articles.
Even more studies can be found on The Open Citation Project home page.
There are those who argue that it is difficult to prove a citation advantage because of problems with the methods and definitions used in these studies, i.e. that open access is defined as freely available material rather than as the publishers’ article processing charge business model. Criticism of the methods concerns control groups, samples and which conclusions can be drawn. These problems are discussed in Dr Henk Moed’s article “Does open access publishing increase citation or download rates”.
There is research indicating that open access publishing increases citations, but there are also studies indicating the opposite. There are likewise studies indicating that making research data available increases citations. At least initially it is better to publish open access, because these articles are disseminated more quickly. Open access journals have now matured, and the development of altmetrics has added other indicators of dissemination and impact.
The number of electronic journals, and especially open access journals, is increasing. This means that a growing number of people have access to research articles. New areas develop with the move towards online journals. One of these areas is altmetrics, which is related to open access.
Altmetrics measures impact at the article level and is seen as an alternative to traditional article-level indicators. We have written about altmetrics in this blog before. In short, altmetrics means that bookmarks, links, tweets, Facebook likes, blog posts etc. are used to indicate the impact a publication has. Altmetrics considers what happens in social media and reflects the ongoing discussion a research publication may have sparked. It also shows that researchers are moving their work online and into web-based services.
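The idea of combining counts from several online sources into one attention figure can be sketched in a few lines of code. This is a toy illustration only: the sources and weights below are hypothetical, chosen for this example, and are not the scoring formula of Altmetric.com or any other real altmetrics provider.

```python
# Toy sketch of the altmetrics idea: combine mention counts from
# different online sources into a single attention score.
# These weights are hypothetical, invented for illustration only;
# they are NOT any real provider's formula.
HYPOTHETICAL_WEIGHTS = {
    "tweets": 1,
    "facebook_likes": 1,
    "bookmarks": 2,
    "blog_posts": 5,
    "news_stories": 8,
}

def attention_score(mentions):
    """Weighted sum of mention counts; unknown sources contribute nothing."""
    return sum(HYPOTHETICAL_WEIGHTS.get(source, 0) * count
               for source, count in mentions.items())

# Example: an article with 40 tweets, 3 blog posts and 12 bookmarks.
article_mentions = {"tweets": 40, "blog_posts": 3, "bookmarks": 12}
print(attention_score(article_mentions))  # 40*1 + 3*5 + 12*2 = 79
```

The weighting reflects a common intuition that a blog post or news story signals more engagement than a single tweet, but any real indicator would need to justify its weights, which is exactly the kind of question the drawbacks below raise.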
There are benefits and drawbacks to altmetrics. One benefit is that all types of publications are considered, not just traditional articles published in journals with a high journal impact factor. Altmetrics is also a quicker way to measure research impact: we do not have to wait two years or more to say something about a paper’s effect, but can see it right away in the discussions and tweets. PLOS Article Level Metrics for Researchers lists, among others, the following benefits:
- Researchers can see and collect feedback on the dissemination and impact of their research in real time, and may share raw data with collaborators, research administrators and research funders
- Trends in research become visible more quickly
- Researchers can find potential collaborators based on the interest in the research
- Alternative interpretations based on research data, methods and results can be discovered
- Strategies to disseminate and publish research can be followed up and evaluated
- Research results are evaluated according to their content, not according to the “container” (the journal) in which they were published.
One drawback is that we do not yet have a clear understanding of what the numbers mean. Do they show real impact and dissemination, or just the buzz that research results have managed to create?
Another drawback is that the numbers can be seen as a popularity contest, which may invite manipulation. That said, altmetrics is no more susceptible to manipulation than other bibliometric indicators, which can also be manipulated. When this happens, the numbers lose their value.
As said in the beginning, there is a connection between altmetrics and open access. Studies show that publishing scientific articles open access increases downloads, and thereby dissemination, which makes altmetrics an interesting measurement. JSTOR has recorded 150 million attempts yearly to access articles that it offers only behind toll access. This means that there are articles which do not have the impact they could have, because potential readers cannot access them. There is a citation bias towards articles that are freely available, and this corresponds with altmetrics: of the ten most popular articles of 2012 according to Altmetric.com, seven were open access, and none were published by Nature or Science.
Altmetrics has made us think about the definition of impact. Impact is still, at least in the open access context, unexplored and underdeveloped. One should keep in mind that altmetrics works best in social spheres where openness and open standards prevail. This suits open access well, but the traditional publishing model less so.
Davis, P. M., Lewenstein, B. V., Simon, D. H., Booth, J. G., & Connolly, M. J. L. (2008). Open access publishing, article downloads, and citations: Randomised controlled trial. BMJ, 337:a568.
The theme of this year’s Open Access Week is redefining impact: how we measure and evaluate scientific results and their impact. Bibliometric indicators based on citations have long been used to measure the effect of scientific production. Altmetrics is a relatively new way to measure impact and dissemination, taking into account other things such as document downloads, links, tweets, blog posts, discussions in online forums etc.
During the coming week there will be blog posts about altmetrics, the impact of research published open access, research data and negative results. The last blog post, next Friday, will be about how to find open access journals with a good reputation and a broad readership.
Next week is the international Open Access Week, and we will be drawing attention to it here on the blog. You can warm up by reading Richard Poynder’s series of interviews on the State of Open Access; he has interviewed many of the top people within open access. Most of them agree that there has been a lot of progress in recent years. That being said, there are still problems, e.g. with the implementation of open access.