By Ian Evans
In science, one aspect of transparency involves ensuring that the research methods and results are available to be analyzed, reused, critiqued and commented on. It’s essential to the process and integrity of science. Here are some of the ways Elsevier is helping make science more transparent.
1. We support the TOP Guidelines to enable data sharing across all our journals.
Last year, Elsevier became a signatory to the Transparency and Openness Promotion (TOP) guidelines. By introducing new journal data policies, we reinforced our support for transparency,
openness and reproducibility of research. Afterwards, Dr. Brian Nosek, Professor in the Department of Psychology at the University of Virginia and Executive Director of the Center for Open Science, which guided the development of the guidelines, commented: “We are delighted to collaborate with Elsevier to help improve transparency practices across their journals portfolio. The TOP Guidelines are one element of a broader strategy to shift cultural norms and incentives for improving transparency in publishing.” Read the full story in Editors’ Update.
2. CiteScore provides transparent and comprehensive insights into journal impact.
In response to academia’s call for metrics that provide a broader, more transparent view of an academic journal’s citation impact, Scopus developed CiteScore metrics, a set of eight indicators that offer complementary views to analyze the impact of all serial titles — including journals — on Scopus. The CiteScore value, its monthly CiteScore Tracker, CiteScore Rank, CiteScore Quartile and CiteScore Percentile are part of the broader “basket of metrics.” They join SNIP (Source Normalized Impact per Paper) and SJR (SCImago Journal Rank) as metrics available for the journals indexed in Scopus. This set of metrics has complementary characteristics, providing a holistic view on journal performance. Read more about CiteScore.
Ian Rowlands, Research Information Specialist at King’s College London, told Elsevier Connect last year that there has been much interest in CiteScore among his colleagues there:
For me, what’s really interesting about CiteScore is that you’re using a longer time window than the classic journal Impact Factor. And … that should mean that some research areas that are perhaps more slow moving in their citation rates would be seen in a more positive light. And certainly some of the interactions I’ve had with academics at King’s College London, they’ve been very interested in not just CiteScore but some of these new metrics that are coming along like the Scopus Views. They’re really starting to get engaged with this.
3. We’re breaking down barriers to reproducibility.
Reproducibility of research is an important issue at Elsevier. As Donna de Weerd-Wilson and Dr. William Gunn wrote in their January 2017 story “How Elsevier is breaking down barriers to reproducibility”:
We can help raise the bar on reproducibility by lowering barriers for researchers to publish replication studies, empowering researchers to share their methods and data, championing rigorous and transparent reporting, and creating outlets for research that upholds reproducibility.
In 2016, Cell Press introduced STAR Methods, which outlines the features of a robust, reproducible method: Structured, Transparent, Accessible Reporting. Several other Elsevier journals also have policies that promote reproducibility, including the Mandatory Replication Policy in Energy Economics and Review of Economic Dynamics, the Scientific Checklist in Biochemical Pharmacology and the Invited Reproducibility Paper in Information Systems.
4. We have metrics for all types of scholarly output.
Dr. Bruce Herbert, Director of Scholarly Communications at Texas A&M University, wanted to show how the development of knowledge, innovations and creative works at his institution has made an impact on the world. To do that, he knew that the scholars and researchers there needed to be able to tell their research story in their own way, whether they were in a research department or a performance arts department.
To that end, Texas A&M partnered with Plum Analytics, one of the leading suppliers of altmetrics, which was acquired by Elsevier last year. That meant being able to show impact whether the scholarly output was a research article, clinical guidelines, or a musical performance on YouTube.
“What happens with these groups is that the impact of their works looks weaker when measured with the traditional tools,” Dr. Herbert explained in Elsevier Connect last year. “These scholars really benefit from altmetrics because now they have data or evidence of the impact their work has on the world that doesn’t exist in more traditional databases.”