We’d like to start by welcoming Steven Hyman, Alan Schechter, Judith Kimble and Marvin Wickens to the F1000 Research Advisory Panel. Steven is Director of the Stanley Center for Psychiatric Research at the Broad Institute, and Harvard University Distinguished Service Professor of Stem Cell and Regenerative Biology. Alan is Chief of the Molecular Medicine Branch of the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) at the National Institutes of Health (NIH). Judith and Marvin are both at the University of Wisconsin-Madison, where Judith is Henry Vilas Professor in the Departments of Biochemistry and Medical Genetics and an Investigator with the Howard Hughes Medical Institute (HHMI), while Marvin is Max Perutz Professor of Molecular Biology and Biochemistry. We are delighted to have them on board our growing Panel, which now comprises more than 100 of the biggest names in biology and medicine, advising us as we work through the many complex issues involved in the F1000 Research project.
Altmetrics in the wild
The review addresses four important questions regarding the utility of altmetrics:
- How much and what kind of altmetrics data exist?
- How are altmetrics distributed over time?
- How do altmetrics relate to one another and to traditional citations?
- Can we cluster articles of different impact types using altmetrics?
The authors highlight three particularly important points from their findings. Firstly, there is no shortage of data from altmetrics sources, though different indicators vary greatly in activity: whilst 5% of sampled articles are cited in Wikipedia, around 80% have been included in at least one Mendeley library. Secondly, though altmetrics and citations track distinct forms of impact, they are nonetheless related, and neither approach alone can completely describe scholarly use. Finally, articles cluster in ways that suggest different impact “flavors”, as Heather has previously blogged about in “the flavors of research impact through altmetrics”.
A post from Mark Hahnel at FigShare reiterates a question that we at F1000 Research know all too well, one posed by many researchers when considering publishing their work on the web: will it compromise my future chances of publishing in a peer-reviewed journal? As Mark points out, this fear stems from the Ingelfinger rule, created in 1969 by the then-editor of the New England Journal of Medicine (NEJM) as a way to ensure the NEJM retained the ‘originality factor’. This rule seems more than a little outdated, especially given the enormous technological advances we’ve witnessed since its creation. We have been working with journals on this subject for some time now with respect to F1000 Posters, asking whether posting your conference poster or slides there would count as a prior publication (see our list of responses). Ivan Oransky took this matter up further in posts on his Embargo Watch blog here and here. Luckily, for the most part, opinion has changed to fit the times: we have a full list of journals and publishers who would not view publication of datasets with a DOI and associated protocol information as prior publication, as well as a couple that indicated that they still would.
ScienceOnlineNOW went live earlier this week – a new hub for ScienceOnline, “a non-profit organization that facilitates discussion about science through online networks and face-to-face events”. The dates for ScienceOnline2013 were also announced (Jan 31 – Feb 2 at North Carolina State University), along with details of other forthcoming events such as a monthly ScienceOnline NYC and, starting in April 2012, ScienceOnline West Coast. You can also visit our own Upcoming Meetings list for information on other key events – and please do let us know of any others you think may be of particular interest!