Bibliography

2014
Zahedi, Zohreh, Rodrigo Costas, and Paul Wouters. “How Well Developed Are Altmetrics? A Cross-Disciplinary Analysis of the Presence of ‘Alternative Metrics’ in Scientific Publications”. Scientometrics (2014): 1-23. Print.
Wang, Xianwen, et al. “Usage History of Scientific Literature: Nature Metrics and Metrics of Nature Publications”. Scientometrics 98 (2014): 1923-1933. Print.
Abstract: In this study, we analyze the dynamic usage history of Nature publications over time using Nature metrics data. We conduct the analysis from two perspectives. On the one hand, we examine how long it takes before an article’s downloads reach 50%/80% of its total; on the other hand, we compare the percentage of total downloads in the 7, 30, and 100 days after publication. In general, papers are downloaded most frequently within a short period right after publication, and we find that, compared with non-Open Access papers, readers’ attention to Open Access publications is more enduring. Based on the usage data of a newly published paper, regression analysis can predict its expected total usage counts.
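The prediction step mentioned at the end of this abstract can be illustrated with a minimal sketch, assuming a simple linear relationship between a paper’s first-week downloads and its eventual total. The data and variable names below are invented for illustration; this is not the authors’ actual model.

import numpy as np

# Hypothetical training data: first-week downloads vs. eventual totals
# for a set of already-mature papers (illustrative numbers only).
early = np.array([120, 300, 45, 800, 210, 95, 560], dtype=float)
total = np.array([480, 1100, 210, 2900, 800, 400, 2100], dtype=float)

# Fit total ~= a * early + b by ordinary least squares.
a, b = np.polyfit(early, total, deg=1)

# Predict the expected total usage of a newly published paper
# from its first-week downloads.
new_paper_early = 150.0
print(f"expected total downloads: {a * new_paper_early + b:.0f}")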
Sud, P., and M. Thelwall. “Evaluating Altmetrics”. Scientometrics 98 (2014): 1131-1143. Print.
Abstract: The rise of the social web and its uptake by scholars has led to the creation of altmetrics: social web metrics for academic publications. These new metrics can, in theory, be used in an evaluative role, to give early estimates of the impact of publications or estimates of non-traditional types of impact. They can also be used as an information-seeking aid, helping to draw a digital library user’s attention to papers that have attracted social web mentions. If altmetrics are to be trusted, then they must be evaluated to see whether the claims made about them are reasonable. Drawing upon previous citation analysis debates and web citation analysis research, this article discusses altmetric evaluation strategies, including correlation tests, content analyses, interviews, and pragmatic analyses. It recommends that a range of methods is needed for altmetric evaluations, that the methods should focus on identifying the relative strengths of the influences on altmetric creation, and that such evaluations should be prioritised in a logical order.
Haustein, Stefanie, et al. “Coverage and Adoption of Altmetrics Sources in the Bibliometric Community”. Scientometrics (2014): 1-19. Print.
Doemeland, Doerte, and James Trevino. “Which World Bank Reports Are Widely Read?”. 1 (2014): 1-34. Print.
Abstract: Knowledge is central to development. The World Bank invests about one-quarter of its budget for country services in knowledge products. Still, there is little research about the demand for these knowledge products and how internal knowledge flows affect their demand. About 49 percent of the World Bank’s policy reports, which are published as Economic and Sector Work or Technical Assistance reports, have the stated objective of informing the public debate or influencing the development community. This study uses information on downloads and citations to assess whether policy reports meet this objective. About 13 percent of policy reports were downloaded at least 250 times, while more than 31 percent of policy reports are never downloaded. Almost 87 percent of policy reports were never cited. More expensive, complex, multi-sector, core diagnostics reports on middle-income countries with larger populations tend to be downloaded more frequently. Multi-sector reports also tend to be cited more frequently. Internal knowledge sharing matters, as cross support provided by the World Bank’s Research Department consistently increases downloads and citations.
Mohammadi, Ehsan, and Mike Thelwall. “Mendeley Readership Altmetrics for the Social Sciences and Humanities: Research Evaluation and Knowledge Flows”. Journal of the American Society for Information Science and Technology 2014: n. pag. Print.
Abstract: Although there is evidence that counting the readers of an article in the social reference site Mendeley may help to capture its research impact, the extent to which this is true for different scientific fields is unknown. This study compares Mendeley readership counts with citations for different social sciences and humanities disciplines. The overall correlation between Mendeley readership counts and citations for the social sciences was higher than for the humanities. Low and medium correlations between Mendeley bookmarks and citation counts in all the investigated disciplines suggest that these measures reflect different aspects of research impact. Mendeley data were also used to discover patterns of information flow between scientific fields. Comparing information flows based on Mendeley bookmarking data and cross-disciplinary citation analysis for the disciplines revealed substantial similarities and some differences. Thus, the evidence from this study suggests that Mendeley readership data could be used to help capture knowledge transfer across scientific disciplines, especially for people who read but do not author articles, as well as giving impact evidence at an earlier stage than is possible with citation counts.
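The correlation analysis described here is straightforward to reproduce in outline. The sketch below assumes paired Mendeley readership and citation counts for a sample of articles in one discipline and computes a Spearman rank correlation, a common choice for skewed count data; all numbers are invented for illustration and do not come from the study.

from scipy.stats import spearmanr

# Hypothetical paired counts for articles in one discipline.
mendeley_readers = [15, 3, 42, 0, 7, 28, 11, 2, 64, 9]
citations        = [10, 1, 30, 0, 4, 12, 14, 0, 41, 6]

# Spearman correlation is robust to the heavy skew typical of
# readership and citation distributions.
rho, p_value = spearmanr(mendeley_readers, citations)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")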
Barbaro, Annarita, Donatella Gentili, and Chiara Rebuffi. “Altmetrics as New Indicators of Scientific Impact”. Journal of the European Association for Health Information and Libraries 10 (2014): 3-6. Print.
Abstract: In recent years, researchers and academics in growing numbers are starting to move their everyday work onto the Web, exploring new ways to spread, discuss, share and retrieve information outside of the traditional channel of scholarly publishing. As scholarly communication moves increasingly online, there is a growing need to improve the ways in which the impact of scientific research output is evaluated. Altmetrics, even if they are still in an early stage, have the potential to develop as complements to traditional metrics and to provide a useful insight into new impact types not included in existing measures. This paper summarises the major trends, opportunities and challenges of these new metrics for both researchers and academic research libraries.
Shema, Hadas, Judit Bar-Ilan, and Mike Thelwall. “Do Blog Citations Correlate with a Higher Number of Future Citations? Research Blogs as a Potential Source for Alternative Metrics”. Journal of the Association for Information Science and Technology 65 (2014): 1018-1027. Print.
Abstract: Journal-based citations are an important source of data for impact indices. However, the impact of journal articles extends beyond formal scholarly discourse. Measuring online scholarly impact calls for new indices, complementary to the older ones. This article examines a possible alternative metric source, blog posts aggregated at ResearchBlogging.org, which discuss peer-reviewed articles and provide full bibliographic references. Articles reviewed in these blogs therefore receive “blog citations.” We hypothesized that articles receiving blog citations close to their publication time receive more journal citations later than articles in the same journal published in the same year that did not receive such blog citations. Statistically significant evidence supports this hypothesis for seven of 12 journals (58%) in 2009 and 13 of 19 journals (68%) in 2010. We suggest, based on these results, that blog citations can be used as an alternative metric source.
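The per-journal comparison described in this abstract can be sketched in a few lines. The snippet below illustrates the general approach rather than the authors’ exact procedure: for one hypothetical journal-year it compares later citation counts of blog-cited articles against the remaining articles with a Mann-Whitney U test, a reasonable choice for skewed citation distributions; all counts are fabricated.

from scipy.stats import mannwhitneyu

# Hypothetical later citation counts for one journal and year.
blog_cited     = [12, 25, 8, 19, 31, 14]             # got early blog citations
not_blog_cited = [5, 9, 2, 11, 7, 4, 13, 6, 3, 10]   # did not

# One-sided test: do blog-cited articles receive more citations?
stat, p_value = mannwhitneyu(blog_cited, not_blog_cited, alternative="greater")
print(f"U = {stat}, p = {p_value:.3f}")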
Fenner, Martin, and Jennifer Lin. “Novel Research Impact Indicators”. 23 (2014). Print.
Abstract: Citation counts and, more recently, usage statistics provide valuable information about the attention and research impact associated with scholarly publications. The open access publisher Public Library of Science (PLOS) has pioneered the concept of article-level metrics, where these metrics are collected on a per-article rather than a per-journal basis and are complemented by real-time data from the social web, or altmetrics: blog posts, social bookmarks, social media and more.
2013
Gunn, William. “Social Signals Reflect Academic Impact: What It Means When a Scholar Adds a Paper to Mendeley”. Information Standards Quarterly 25 (2013): 33-39. Print.
Abstract: The article offers information about Mendeley, a U.S.-based academic social network. It notes that Mendeley has emerged as one of the most interesting sources of altmetrics for measuring the impact of academic periodicals, and that altmetrics, or alternative metrics, are so called to distinguish them from bibliometrics; they are a tool for measuring the impact of a periodical at the article level.
Haustein, Stefanie, et al. “Coverage and Adoption of Altmetrics Sources in the Bibliometric Community”. 14th International Society of Scientometrics and Informetrics Conference. Vienna, Austria, 2013. Print.
Abstract: Altmetrics, indices based on social media platforms and tools, have recently emerged as alternative means of measuring scholarly impact. Such indices assume that scholars in fact populate online social environments, and interact with scholarly products there. We tested this assumption by examining the use and coverage of social media environments amongst a sample of bibliometricians. As expected, coverage varied: 82% of articles published by sampled bibliometricians were included in Mendeley libraries, while only 28% were included in CiteULike. Mendeley bookmarking was moderately correlated (.45) with Scopus citation counts. Over half of respondents asserted that social media tools were affecting their professional lives, although uptake of online tools varied widely. 68% of those surveyed had LinkedIn accounts, while Academia.edu, Mendeley, and ResearchGate each claimed a fifth of respondents. Nearly half of those responding had Twitter accounts, which they used both personally and professionally. Surveyed bibliometricians had mixed opinions on altmetrics' potential; 72% valued download counts, while a third saw potential in tracking articles' influence in blogs, Wikipedia, reference managers, and social media. Altogether, these findings suggest that some online tools are seeing substantial use by bibliometricians, and that they present a potentially valuable source of impact data.
Piwowar, Heather A., and Todd J. Vision. “Data Reuse and the Open Data Citation Advantage”. Ed. Xiaolei Huang. PeerJ 1 (2013): e175. Print.
Abstract: Background. Attribution to the original contributor upon reuse of published data is important both as a reward for data creators and to document the provenance of research findings. Previous studies have found that papers with publicly available datasets receive a higher number of citations than similar studies without available data. However, few previous analyses have had the statistical power to control for the many variables known to predict citation rate, which has led to uncertain estimates of the “citation benefit”. Furthermore, little is known about patterns in data reuse over time and across datasets. Method and Results. Here, we look at citation rates while controlling for many known citation predictors and investigate the variability of data reuse. In a multivariate regression on 10,555 studies that created gene expression microarray data, we found that studies that made data available in a public repository received 9% (95% confidence interval: 5% to 13%) more citations than similar studies for which the data was not made available. Date of publication, journal impact factor, open access status, number of authors, first and last author publication history, corresponding author country, institution citation history, and study topic were included as covariates. The citation benefit varied with date of dataset deposition: a citation benefit was most clear for papers published in 2004 and 2005, at about 30%. Authors published most papers using their own datasets within two years of their first publication on the dataset, whereas data reuse papers published by third-party investigators continued to accumulate for at least six years. To study patterns of data reuse directly, we compiled 9,724 instances of third party data reuse via mention of GEO or ArrayExpress accession numbers in the full text of papers. The level of third-party data use was high: for 100 datasets deposited in year 0, we estimated that 40 papers in PubMed reused a dataset by year 2, 100 by year 4, and more than 150 data reuse papers had been published by year 5. Data reuse was distributed across a broad base of datasets: a very conservative estimate found that 20% of the datasets deposited between 2003 and 2007 had been reused at least once by third parties. Conclusion. After accounting for other factors affecting citation rate, we find a robust citation benefit from open data, although a smaller one than previously reported. We conclude there is a direct effect of third-party data reuse that persists for years beyond the time when researchers have published most of the papers reusing their own data. Other factors that may also contribute to the citation benefit are considered. We further conclude that, at least for gene expression microarray data, a substantial fraction of archived datasets are reused, and that the intensity of dataset reuse has been steadily increasing since 2003.
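The multivariate analysis summarized above can be outlined in a short sketch. This is a simplified stand-in for the authors’ full model, not their code: it regresses log-transformed citation counts on an open-data indicator plus two assumed covariates using statsmodels, on fabricated data. The column names, covariate choice, and the log(1 + citations) transform are illustrative assumptions; exponentiating the open-data coefficient then gives the multiplicative “citation benefit”.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Fabricated example data standing in for the 10,555-study dataset.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "open_data": rng.integers(0, 2, n),      # 1 = data in a public repository
    "impact_factor": rng.uniform(1, 15, n),  # journal impact factor
    "n_authors": rng.integers(1, 12, n),     # number of authors
})
# Simulate citations with a roughly 9% boost for open-data papers.
log_cites = (0.09 * df.open_data + 0.15 * df.impact_factor
             + 0.05 * df.n_authors + rng.normal(0, 0.5, n))
df["citations"] = np.expm1(log_cites).round().clip(lower=0)

# OLS on log(1 + citations), controlling for the covariates.
model = smf.ols("np.log1p(citations) ~ open_data + impact_factor + n_authors",
                data=df).fit()
benefit = np.expm1(model.params["open_data"])
print(f"estimated citation benefit for open data: {benefit:.1%}")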
Ovadia, Steven. “When Social Media Meets Scholarly Publishing”. Behavioral & Social Sciences Librarian 32 (2013): 194-198. Print.
Galligan, Finbar, and Sharon Dyas-Correia. “Altmetrics: Rethinking the Way We Measure”. Serials Review 39 (2013): 56-61. Print.
Abstract: Altmetrics is the focus of this edition of “Balance Point.” The column editor invited Finbar Galligan, who has gained considerable knowledge of altmetrics, to co-author the column. Altmetrics, their relationship to traditional metrics, their importance, uses, potential impacts, and possible future directions are examined. The authors conclude that altmetrics have an important future role to play and that they offer the potential to revolutionize the analysis of the value and impact of scholarly work.
Cheung, Man Kit. “Altmetrics: Too Soon for Use in Assessment”. Nature 494 (2013): 176. Print.
Narayan, Bhuva, et al. “Social Media as Online Information Grounds: A Preliminary Conceptual Framework”. Digital Libraries: Social Media and Community Networks. Ed. Shalini R. Urs, Jin-Cheon Na, and George Buchanan. Springer International Publishing, 2013. 127-131. Print.
Priem, Jason. “Scholarship: Beyond the Paper”. Nature 495 (2013): 437-440. Print.
Abstract: The author discusses the replacement of journals and articles by algorithms that rate, filter, and disseminate scholarship. He explores the transition of the peer-review system from a paper-native to a Web-native system that uses Web technology to enhance dissemination, noting that Web-native articles have inconsistent quality and require filtering compared with journal articles. Topics discussed include the dissemination, certification, and reward structure of scholarship.
Adie, Euan, and William Roe. “Altmetric: Enriching Scholarly Content with Article-Level Discussion and Metrics”. Learned Publishing 26 (2013): 11-17. Print.
Abstract: Scholarly content is increasingly being discussed, shared, and bookmarked online by researchers. Altmetric is a start-up that focuses on tracking, collecting, and measuring this activity on behalf of publishers; here we describe our approach and general philosophy. Over the past year we have seen sharing and discussion activity around approximately 750,000 articles. The average number of articles shared each day grows by 5-10% a month. We look at examples of how people are interacting with papers online and at how publishers can collect and present the resulting data to deliver real value to their authors and readers.
Roemer, Robin Chin, and Rachel Borchardt. “Institutional Altmetrics & Academic Libraries”. Information Standards Quarterly 25 (2013): 14-19. Print.
Abstract: The article offers information on the importance of altmetrics in periodical publication. It mentions that altmetrics are a tool for measuring impact at the article level. It notes that the relationship between academic research and altmetrics is helpful for examining the state of institutional bibliometrics. It reports that academic libraries may or may not continue to be the core brokers of impact metrics, but altmetrics will continue to play a core role within higher education.
Rodgers, Emily, and Sarah Barbrow. A Look at Altmetrics and Its Growing Significance to Research Libraries. The University of Michigan University Library, 2013. Print.
Abstract: Many people involved in the scholarly communications process – from academics, students, and researchers, to publishers, librarians, and learners – are participating in a dynamic digital context now more than ever; moreover, digital acts of communication and dissemination of scholarship leave traces of impact that can now be culled and quantified. Altmetrics, metrics based on the social web, provide an opportunity both to more acutely measure the propagation of this communication and to reconsider how we measure research impact in general. While the use of social media and analytics and the structure of tenure and promotion practices are not consistent across or even within disciplines, the practices and experimentation of early adopters, from researchers and institutions to industry, yield stories, lessons learned, and practices worth investigating. Researchers and academic librarians both face new opportunities to engage and support the use of altmetrics tools and methods and to re-examine how scholarship is defined, collected, preserved, used, and discussed. This report summarizes the major trends, opportunities and challenges of altmetrics to both researchers and academic research libraries and outlines ways in which research libraries can participate in shaping this emergent field. Also featured in this article is a micro-case study featuring a partnership between the University of Pittsburgh and Plum Analytics that illustrates how libraries can begin to map out their role on campus in this arena.
