Going … going …
A study recently published in Current Biology [1] investigates the availability of research data from papers published up to 20 years ago, and the findings are startling, even to those who have long advocated for better stewardship of research data.
Timothy Vines, an ecologist at the University of British Columbia, set out to explore the idea “that authors are poor stewards of their data, particularly over the long term.” This premise underlies current legislative and other efforts to ensure long-term preservation of and access to research data by mandating that datasets be made available through a public repository and that grant applications include a formal plan for managing research data.
Vines and his team tested this premise by requesting datasets from 516 ecology articles published between 1991 and 2011. Among the findings:
• The odds of a data set being reported as extant fell by 17% per year. In fact, for papers published in 1991, data could be confirmed as extant for only 2 of the 26 papers examined. For the oldest papers in the study, some 80% of the data may, in effect, be lost.
• Broken e-mail addresses and obsolete storage devices were the main obstacles to data sharing. The share of papers with a working e-mail address for the first, last, or corresponding author fell by 7% per year; for the oldest (1991) papers, 35% of the e-mail addresses did not work. Overall, Vines and his group received responses from the authors of only 37% of the studies (no working e-mail address for 25% overall; no response received for 38% overall). As to the data’s availability, Vines told Nature:
“Most of the time, researchers said ‘it’s probably in this or that location’, such as their parents’ attic, or on a zip drive for which they haven’t seen the hardware in 15 years. … In theory, the data still exist, but the time and effort required by the researcher to get them to you is prohibitive.”
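Note that the 17% annual decline applies to the *odds* of the data being extant (odds = p / (1 − p)), not the probability itself. A minimal Python sketch, assuming for illustration only a baseline probability of 0.95 at publication, shows how quickly that decline compounds over two decades:

```python
def prob_extant(p0, years, odds_decline=0.17):
    """Probability that data are still extant after `years`, given a
    baseline probability `p0` at publication and a multiplicative
    per-year decline in the odds (the quantity the study models)."""
    odds0 = p0 / (1 - p0)                     # convert probability to odds
    odds = odds0 * (1 - odds_decline) ** years  # apply compounded decline
    return odds / (1 + odds)                  # convert back to probability

# Illustrative trajectory (p0 = 0.95 is an assumption, not the study's figure):
for years in (0, 5, 10, 20):
    print(years, round(prob_extant(0.95, years), 2))
# roughly: 0 → 0.95, 5 → 0.88, 10 → 0.75, 20 → 0.31
```

With this (assumed) baseline, only about a third of datasets would remain retrievable after 20 years; a lower baseline or additional practical barriers would push the figure toward the ~80% loss reported for the oldest papers.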
Vines concludes that “policies mandating data archiving at publication are clearly needed” and argues that journals are in a strong position to require that data be submitted to a public repository as a condition of publication. A number of well-established data repositories exist; see Databib for examples, or speak to your friendly librarian for more information. You may also wish to consult this guide to research data management, developed by the Library’s Office of Scholarly Publishing & Communication.
1. Vines TH, Albert AYK, Andrew RL, Débarre F, Bock DG, Franklin MT, Gilbert KJ, Moore J, Renaut S, Rennison DJ (2013) The availability of research data declines rapidly with article age. Current Biology, online in advance of print. doi:10.1016/j.cub.2013.11.014
Data for Vines’ study is available through the Dryad digital data repository:
Vines TH, Albert AYK, Andrew RL, Débarre F, Bock DG, Franklin MT, Gilbert KJ, Moore J, Renaut S, Rennison DJ (2013) Data from: The availability of research data declines rapidly with article age. Dryad Digital Repository. doi:10.5061/dryad.q3g37
Gibney, E. and Van Noorden, R. (2013). “Scientists losing data at a rapid rate.” Nature.
“Raw Data’s Vanishing Act.” The Scientist (2013).
Gear Up is a great opportunity to explore services and tools and to speak with people on campus who can support your research endeavors. At the “Impact of Your Work” table, we explore different tools that show the impact of your work and research. The journal article is the traditional form of publication, but it is only one way to disseminate work. The graphic (right) shows an array of possibilities.
Publication citations are only one way to measure impact. Most people have heard of the h-index: the largest number h such that an author has h papers that have each been cited at least h times. You can find your h-index through Web of Science or Google Scholar Citations.
- In Web of Science, the most accurate way to generate a citation report is to do an “Author Search” and follow the prompts that are meant to find the right author (by field and by affiliation). In addition to h-index, the citation report shows your publication count by year and the number of citations received by year.
- Google Scholar Citations is another place to find the h-index. You do need to sign up and create a profile (which can be public or private). It can be set up to automatically or manually populate with your publications. The metrics are immediately updated. Check out Prof. David Kotz’s profile as an example!
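The h-index definition above can be sketched in a few lines of Python (the citation counts here are made up for illustration):

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h papers
    have been cited at least h times each."""
    h = 0
    # Sort citation counts descending; walk down until a paper's
    # citations fall below its rank.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers with >= 4 citations each
print(h_index([25, 8, 5, 3, 3]))  # 3: a single highly cited paper doesn't help
```

As the second example shows, the h-index rewards sustained citation across many papers rather than one blockbuster result.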
Some researchers have very common last names and first (and middle) initials, so it can be difficult to pinpoint their work exactly. Hence, we recommend all researchers sign up for an ORCID identifier, which works a bit like a Social Security Number for your scholarly record. Many publishers and funding agencies now include it as a field in submissions. No matter what variation of your name gets used, as long as it’s associated with this ID, you’ll get credit!
To raise your impact, you have to broaden your reach. One way is to make your work as openly available as possible. For example, you can choose to archive on your website, deposit in a repository, or publish open access.
- SHERPA/RoMEO allows you to search for your publisher’s copyright and self-archiving policies. It’s easy to figure out if you can use the publisher’s final version on your website!
- The Registry of Open Access Repositories is a listing of world-wide repositories. Dartmouth does not (yet) have an institutional repository (see Carole Meyers if you’d like to learn more).
- If you’re considering publishing open access, talk to us (namely, Barbara DeFelice)! Dartmouth supports the publication fees for open access journals that qualify under the Compact for Open Access Publishing Equity (COPE).
You can also disseminate your work in many different forms, including figures, graphics, presentations, datasets, code, etc. Several sites help facilitate this: figshare, SlideShare, GitHub, Dryad, YouTube, etc. This opens up new ways of measuring impact and redefines what impact means.
- Altmetrics.org tries to keep track of the most recent tools to have arisen.
- ImpactStory aggregates impact data from a variety of sources and shows impact of a variety of different forms of dissemination. See a sample profile here.
- ResearchGate is a tool that is growing in popularity here at Dartmouth. I spent some time looking into it and have written up a separate blog post about it. It also has its own metric called an “RG score.”
- We also have an extensive listing of tools on our Scholarly Publishing & Communication guide.
The scholarly communications landscape is constantly changing and keeping up with trends can be a challenge, but we are here to help! Contact your favorite librarian anytime.
- Roemer, R. C. and Borchardt, R. (Nov 2012). “From bibliometrics to altmetrics: a changing scholarly landscape.” College & Research Libraries News, 73 (10). pp. 596-600.
- Kear, R. and Colbert-Lewis, D. (Sept 2011). “Citation searching and bibliometric measures: resources for ranking and tracking.” College & Research Libraries News, 72 (8). pp. 470-474.
- Howard, J. (Feb 2012). “Tracking Scholarly Influence Beyond the Impact Factor.” The Chronicle of Higher Education.
- Henning, V. and Gunn, W. (Sept 2012). “Impact factor: researchers should define the metrics that matter to them.” The Guardian.
- Quigley, J. “Altmetrics – Alternative Metrics for Articles (think: impact 2.0).” Kresge Physical Sciences Library and Cook Mathematics Collection blog.
When people ask me what aspect of math I studied as a math major, I like to say the intersection of math and art. Although I haven’t studied the mathematical aspects in depth, I love origami and have been folding on and off for the last 15 years. Recently, I’ve been folding lucky stars (see my other post for more pictures), but I want to go back to working on modular origami soon.
Plus magazine published a really interesting article on “the power of origami.” The author talks about the impact origami has made in science and technology and touches on the basics of the math behind it. Big names in origami-math include Robert J. Lang and Thomas Hull. Come check out some of the books we have at the Library!
And go see the “book” Fun Origami at Rauner.
ACS Announces New OA Initiatives
The American Chemical Society (ACS) has just announced four new initiatives that, taken together, expand the range of options available to authors, readers, researchers, and others interested (perhaps as a matter of principle) in broader access to scholarship, particularly scholarship funded by taxpayer-supported granting agencies such as the National Institutes of Health or the National Science Foundation.
The four new options, which will take effect in 2014, are:
* ACS Central Science, a new, highly selective, peer-reviewed journal that will be completely open access.
* ACS Editors’ Choice will release one article each day during 2014, to be available open access; the articles will be selected by the scientific editors of ACS journals from articles published in 2014 (this seems a little random, like subscribing to a Word of the Day to enrich your vocabulary, but could create an interesting body of material).
* ACS Author Rewards provides credits to the corresponding author of articles published during 2014. The credits can be used towards paying Open Access publishing fees during 2015-17 (ACS charges a standard fee for publishing OA, with significant discounts to ACS members affiliated with a subscribing institution).
* ACS AuthorChoice expands the options for authors to pay fees to make their articles immediately available to all. AuthorChoice has been an option since 2006, so this is not a major departure from previous practice, but authors can now choose between immediate open availability and a less expensive 12-month embargoed option (I could not help but notice that overall ACS AuthorChoice fees, as posted, will go up some 25% in 2014, so perhaps this last initiative is a bit of a wash; compare 2013 pricing options, Option A, with the 2014 AuthorChoice options).
Coming just after ACS’ announcement of the completion of its project to digitize and make freely available the historic supporting information for published ACS articles (“ACS Digitizes Legacy Data (and makes it freely available to all)”), these are real advances in support of ACS’ mission “to advance the broader chemistry enterprise and its practitioners for the benefit of Earth and its people.”
SI accompanying an early Sharpless article (JACS, 1981)
This just in! We’d heard about this initiative before (“ACS – 2013 Initiatives (aka good news!)”) and it appears that the work is now complete. Congratulations to ACS for this great contribution, made freely available to all, with relevant data linked from articles’ abstract pages.
“ACS Publications today announces the completion of a comprehensive undertaking to digitally convert and conserve the Supporting Information for its broadly subscribed ACS Legacy Archives journals collection. This initiative was part of the Society’s commitment to broaden the online accessibility of the supporting information and data associated with the ACS Legacy Archives –– a premium collection of nearly half a million original research articles published in ACS journals between the years 1879 and 1995. The digitization effort has generated new Supporting Information files for 40,000 ACS original research articles, and in total comprises 800,000 pages of highly valuable data and underlying research information.”
… and, from the full press release:
“Among the extensive collection of the newly available digital information are many noteworthy examples of data that supported published scientific breakthroughs, such as:
And now, about that microfiche …