Library Breakfast on open access and locked articles

During the International Open Access Week, we invite researchers and students at the University to a Library Breakfast on the theme of open access. This year, a lot has happened in the area of open access in Sweden. During this breakfast, research librarian Signe Wulund will tell us about what happened with the publisher Elsevier, and present alternative routes to locked research publications. If you want to participate, you need to register before October 22nd using this form.

The breakfast includes coffee, tea or juice, a sandwich and fruit, and is open to students and staff at the University of Borås. However, there are only 25 seats, and registration is mandatory in order to participate.

Text: Katharina Nordling
Photo: Mostphotos

Find open access journals with impact

There have been blog posts about altmetrics, the impact of open access research, research data and negative results. Maybe it is time to present some good examples of quality open access journals, or at least show how to find them.

There are some obvious examples, such as PLoS with its many journals, BioMed Central's journals and PeerJ. The first are megajournals, all within medicine, life sciences, biomedicine and similar fields. arXiv.org is a preprint archive for physics, chemistry, mathematics, computer science and related areas.

Web of Science has a list of open access journals. Here you can browse journal titles or search for a specific title.

The ITC library has created a list of open access journals with impact factors. There are over eight hundred journals from all areas of research on the list. Unfortunately, there is no indication of when the list was made, so some of the impact factors may have gone up and others down since then.

The University of Oregon has also created a ranking list with the help of Scopus and SJR: all open access journals from 2014 which are included in SCImago Journal & Country Rank. They have also created a list where open access journals are ranked according to eigenfactor.org. Eigenfactor is built to take into account the different citation practices of research areas, and it includes data for the past five years. Citations from important journals are weighted higher than citations from lower-ranking journals.
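The idea of weighting citations by the importance of the citing journal can be illustrated with a small eigenvector-centrality calculation. This is a toy sketch, not the actual Eigenfactor algorithm (which also handles self-citations and a five-year window); the journals and citation counts below are invented:

```python
# Toy citation matrix: M[i][j] = citations from journal j to journal i.
# Journal names and counts are invented for illustration.
journals = ["A", "B", "C"]
M = [
    [0, 2, 1],   # citations received by A
    [3, 0, 1],   # citations received by B
    [1, 1, 0],   # citations received by C
]

# Column-normalize so each journal distributes one unit of influence.
col_sums = [sum(M[i][j] for i in range(3)) for j in range(3)]
P = [[M[i][j] / col_sums[j] for j in range(3)] for i in range(3)]

# Power iteration: influence received from influential journals counts
# for more, which is the core idea behind Eigenfactor-style metrics.
scores = [1 / 3] * 3
for _ in range(100):
    scores = [sum(P[i][j] * scores[j] for j in range(3)) for i in range(3)]

for name, s in zip(journals, scores):
    print(f"{name}: {s:.3f}")
```

Here journal B ends up with the highest score even though A and B receive similar raw citation counts, because B is cited heavily by the well-cited A.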

There is also a list of open access journals which had an impact factor in Web of Science in 2009, but due to copyright only the SCImago SJR indicator and the Scopus SNIP indicator are shown; SJR and SNIP are available on the open web. There are 619 journals on the list. You can check a journal’s SJR and SNIP at SCImago Journal & Country Rank.

The SJR and SNIP indicators measure, respectively, a journal’s scientific prestige and a kind of normalized citation value per article. For example, SJR values citations from related journals more highly and compensates for the number of issues a journal publishes. SNIP aims to make comparisons across research areas possible, so that, for example, citation counts in medicine can be compared with those in the humanities. This has its problems, which are not discussed here.

Check out a YouTube film on SJR & SNIP vs. impact factor if you are interested.

You might have noticed that it is quite easy to find open access journals in medicine, biology and other sciences. It is more difficult to find open access journals in other areas, especially ones with an impact factor. This is partly because other areas are not as well represented in Web of Science.

DOAJ, the Directory of Open Access Journals, is a good place to look for journals. You can search for a journal, or browse by a number of categories such as new journals, subject, country of origin, license (one of the Creative Commons licenses) and APC (article processing charge). You can find journals within, for example, arts & architecture, history & archaeology, and social sciences.
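Alongside the web interface, DOAJ also offers a public search API. As a minimal sketch, a journal-search URL might be built like this (the endpoint path reflects the DOAJ API as we understand it; check the current DOAJ API documentation before relying on it):

```python
from urllib.parse import quote

# Base endpoint of the DOAJ journal-search API (assumed; verify against
# the current DOAJ API documentation).
DOAJ_API = "https://doaj.org/api/search/journals/"

def doaj_journal_search_url(query: str) -> str:
    """Build a DOAJ journal-search URL for a free-text query."""
    return DOAJ_API + quote(query, safe="")

# e.g. look for open access journals in history & archaeology
url = doaj_journal_search_url("history AND archaeology")
print(url)
# The JSON response can then be fetched with urllib.request or requests.
```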

Pieta Eklund

Negative results = no results, or contribution to knowledge?

Negative results, that is, results which do not support the stated hypothesis or which fall too far from what was expected, are rarely analyzed. Nor are these results published as often. There is a bias, so-called publication bias, towards publishing the positive: the results which support the hypothesis. This is also a research ethics problem. It is partly due to the publish-or-perish culture: there is competition to publish and to attract citations in order to compete for research funding. Publication bias gives a distorted image of a research area and its literature. It can even lead researchers to manipulate research data.

Scientific journals are not interested in publishing replication studies because they lack news value. Nor are they interested in publishing negative results, although a lot of important research which we regard as established truth has later proved impossible to replicate. Some postdocs went as far as to create the Journal of Negative Results: Ecology & Evolutionary Biology, but this is not a sustainable solution. Instead, more research data should be made available and the norms of scientific communication should change: researchers should describe exactly what they have done. Maybe this was not possible earlier, with print journals and their word limits. With electronic journals and open access, it now is. Many open access journals are happy to give extra space to extended methodological descriptions and discussions.

In his TED talk What doctors don’t know about the drugs they prescribe, Ben Goldacre tells of a study which seemed to show that some university students had the ability to see into the future. We hear only about the times when someone succeeded, which may lead us to believe in the claim. We assume that a scientific article is correct, that a certain medicine works well against, say, depression. What we do not know is that often only the positive results have been reported. So we hear only of the cases where the oracle was right, never of the times it was wrong. In medicine and pharmacy, reporting only the positive results could be lethal. Goldacre gives an example of a medicine which was examined in a number of studies: 38 with positive results and 36 with negative results. 37 of the studies with positive results were published, while only 3 of the negative studies were. In The Power of Negative Thinking, Jennifer Couzin-Frankel writes about the same phenomenon: only a fraction of the studies could be replicated.
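Goldacre’s numbers make the asymmetry easy to quantify. The trial counts below are taken from the talk as retold above; the percentage calculation is our own:

```python
# Trial counts from Goldacre's example: 38 positive and 36 negative trials.
positive_total, positive_published = 38, 37
negative_total, negative_published = 36, 3

# Fraction of each group that made it into the literature.
pos_rate = positive_published / positive_total
neg_rate = negative_published / negative_total

print(f"positive trials published: {pos_rate:.0%}")  # 97%
print(f"negative trials published: {neg_rate:.0%}")  # 8%
```

A reader of the published literature would see 37 positive studies against 3 negative ones, instead of the nearly even 38 to 36 split that was actually run.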

Researchers should be encouraged to publish negative results, and it should also become easier to do so. PeerJ, an open access journal in medicine and biology, states on its web page that it publishes methodologically and scientifically sound articles; the results do not have to be newsworthy. PeerJ writes that ”negative/inconclusive results are acceptable”. It also writes that all research data should be available to the reviewers and, if possible, made available to others as well. It even publishes the reviewers’ comments. Maybe one reason PeerJ is able to work this way is that it is a newly started open access journal. Scientific publishing is quite a traditional and protectionist field: publishers are not quick to adopt new ways of working.

To advance knowledge, ambiguous and negative results should be published and research data should be made easily available, which is exactly what open access works for.

Watch Ben Goldacre’s TED Talk on publication bias and its effects.

The film is about 14 mins long.

Read also: Fanelli, D. (2012). Negative results are disappearing from most disciplines and countries. Scientometrics, 90, pp. 891-904. doi:10.1007/s11192-011-0494-7

Also read the short essays on negative results in Marine Ecology Progress Series from 1999. The essays discuss negative results and remind us that even positive results should be eyed with skepticism.

Pieta Eklund

Research data

”Publishing research without data is simply advertising, not science”

Is that so? Is publishing research results without research data just marketing and not science, as Graham Steel suggests? The statement is a modification of what Claire Bower said: ”Publishing articles without making the data available is scientific malpractice”. I think both are trying to say that making research data available is the next step in making science open.

Making research data available is based on the idea of letting others reproduce results and test their reliability, but also of getting more out of the data. It also has to do with gaining a better understanding of research fields where negative results are rarely published, such as biomedicine. To be able to reproduce results, better management of research data and linking between data and publications are needed. There is also a need to be able to find research data, to know who collected it, and to cite it the same way research results are cited.

Last summer, the EU invited interested parties such as researchers, research funders, system developers and librarians to discuss open research data. Their input is important and will influence the coming big EU research funding programme, Horizon 2020. The starting points for the discussions were the following questions:

  • How can we define research data and what types of research data should be open?
  • When and how does openness need to be limited?
  • How should the issue of data re-use be addressed?
  • Where should research data be stored and made accessible?
  • How can we enhance data awareness and a culture of sharing?

Sweden has signed the OECD declaration on access to publicly funded research data. The declaration’s main point is to support access to and sharing of research data. The signatories have also committed to working to make digital research data available according to the goals and principles listed in the declaration, such as openness, international standards, and rules and areas of responsibility for all involved parties.

It was clear from the discussions that commercial data, and data which can be traced back to an individual, should be kept secret, but that some aggregated data should be open. There was also talk of embargo periods for data coming from co-operation between public and private enterprises, so that investors could still be attracted to finance research.

Something to keep in mind when discussing research data is that those who collect the data have the best understanding of its limitations, which might not be clear to those who are second or even third in line to analyze it. Re-analysis of research data should therefore be embarked on with caution.

Making research data available faces bigger obstacles than open access to publications, owing to exceptions for personal privacy, business and national secrets, commercial interests, intellectual property law, and all the other arguments lawyers can come up with to keep research data away from the obligations of open access.

Pieta Eklund