DOI 10.1007/s11192-016-2077-0
Yin-Leng Theng
Introduction
Since 2010, altmetrics has been emerging as a new source of metrics to measure scholarly
impact (Priem et al. 2010). Traditional impact indicators, known as bibliometrics, are
commonly based on the number of publications, citation counts and peer reviews of a researcher, journal, or institution (Haustein and Larivière 2015). As research publications
and other research outputs were increasingly placed online, usage metrics (based on
download and view counts) as well as webometrics (based on web links) (Priem and
Hemminger 2010; Thelwall 2012b) emerged. These days, research outputs have become
more diverse and are increasingly being communicated and discussed on social media.
Altmetrics are based on these activities and interactions on social media relating to
research output (Weller 2015).
Although there is no formal definition of altmetrics, several definitions have been proposed. The vision presented in the Altmetrics Manifesto by Priem et al. (2010) defines altmetrics as: "This diverse group of activities (that reflect and transmit scholarly impact on Social Media) forms a composite trace of impact far richer than any available before. We call the elements of this trace altmetrics."; as does this definition on altmetrics.org: "altmetrics is the creation and study of new metrics based on the Social Web for analyzing, and informing scholarship."1 Also from Priem et al. (2014, p. 263): "(...) altmetrics (short for "alternative metrics"), an approach to uncovering previously-invisible traces of scholarly impact by observing activity in online tools and systems." Another definition has been proposed by Piwowar (2013, p. 159): "(...) scientists are developing and assessing alternative metrics, or 'altmetrics'—new ways to measure engagement with research output." A definition from Haustein et al. (2014a, p. 1145) states: "Altmetrics, indices based on social media platforms and tools, have recently emerged as alternative means of measuring scholarly impact." More recently, Weller (2015, pp. 261–262) proposes these definitions: "Altmetrics–evaluation methods of scholarly activities that serve as alternatives to citation-based metrics (...)" and "Altmetrics are evaluation methods based on various user activities in social media environments." In a white paper from NISO (2014) (National Information Standards Organization),2 altmetrics is described as: "'Altmetrics' is the most widely used term to describe alternative assessment metrics. Coined by Jason Priem in 2010, the term usually describes metrics that are alternative to the established citation counts and usage stats—and/or metrics about alternative research outputs, as opposed to journal articles."
In summary, the common understanding across all definitions is that altmetrics are new or alternative metrics to the established metrics for measuring scholarly impact. The main difference between the definitions, however, lies in how and where altmetrics can be found: activities on Social Media, based on the Social Web, observing activity in online tools and systems, engagement with research output, based on social media platforms and tools, scholarly activities, or various user activities in social media environments.
The main advantages of altmetrics over traditional bibliometrics and webometrics are that they offer fast, real-time indications of impact, they are openly accessible and transparent, they include a broader non-academic audience, and they cover more diverse research outputs and sources (Wouters and Costas 2012). Altmetrics, however, also face several challenges such as gaming, manipulation and data quality issues (Bornmann 2014b). As
1 http://altmetrics.org/about/. Accessed 18 Feb 2016.
2 http://www.niso.org/topics/tl/altmetrics_initiative. Accessed 18 Feb 2016.
the awareness of altmetrics grows, an increasing number of people from different sectors,
disciplines and countries are showing more interest in altmetrics and want to know their pitfalls and potential. Universities, libraries, funding agencies, and researchers have
common concerns and questions regarding altmetrics. For example, what are altmetrics?
When did research on altmetrics commence and what topics have been investigated? How
do altmetrics compare to traditional metrics? Are there any studies measuring this and what
are their findings?
This paper aims to give an overview of this emerging research area of measuring research impact based on social media and to address some of these questions. There have been compilations of existing altmetrics tools, such as by Chamberlain (2013), Peters et al. (2014), Priem and Hemminger (2010), and Wouters and Costas (2012), and more recent listings by Kumar and Mishra (2015) and by Weller (2015), where a review of the altmetrics literature is given. Our paper aims to give a compact and yet comprehensive overview of the altmetrics landscape, applying and considering the frameworks and listings made in previous works. In the "The altmetrics landscape" section, an overview is given of the altmetrics landscape, depicting the inter-relationships between the different aggregators and comparing their different features, data sources, and social media events. In the "Literature on altmetrics research" section, we analyse a cross-section of the academic literature relating to altmetrics research, thereby highlighting the trends and research topics handled in recent years. In the "Results of research on altmetrics" section, the consolidated results across multiple studies on the coverage of altmetrics are presented, as well as the results of a meta-analysis of cross-metric validation studies comparing altmetrics to citations and to other altmetrics. We conclude in "Conclusion and outlook" with a discussion of the challenges facing this relatively new research area and highlight the gaps and future topics.

The altmetrics landscape
According to Haustein et al. (2016), a research object is an agent or document for which
an event can be recorded. Events are recorded activities or actions that capture acts of
accessing, appraising or applying research objects. Altmetrics are based on these events.
Research documents include very diverse artifacts, for example, traditional documents
could be journal articles, book chapters, conference proceedings, technical reports, theses,
dissertations, posters, books, and patents. These are typically hosted on publishers' websites, in online journals and in digital libraries such as PMC (PubMed Central), Scopus, PLOS, Elsevier, or Springer. Research documents hosted on social media comprise more modern research artifacts such as presentation slides on SlideShare, lecture videos on YouTube, blog posts on ResearchBlogging, datasets on Dryad, and software code on GitHub. Research agents could be individual scholars, research groups, departments, universities, institutions or funding agencies. Research agents are usually hosted on research or academic social networks such as ResearchGate and Academia.edu. The altmetrics landscape covers various social media applications and platforms as data sources for altmetrics. Data sources record events related to research objects, making these available usually via an API, an online platform, a repository, or a reference manager.
Hosts of research objects often act as data sources, for example, Mendeley3 records events
on articles such as saved or read and provides these events to altmetrics consumers.
Altmetrics consumers comprise so-called aggregators or providers of altmetrics, who track and aggregate the various events gathered from social media data sources (Haustein et al. 2016). These aggregated events are made available as altmetrics to the end-users, who are usually researchers, faculty staff, libraries, publishers, research institutions, universities, or funding agencies. In the following, we highlight the major altmetrics aggregators and make a comparison of the features they offer and the different data sources they use.

3 Mendeley is an online reference manager for scholarly publications.
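To make these roles concrete, the following minimal sketch models research objects, events and an aggregator along the lines of the framework above. It is our own illustration: all class and field names are illustrative assumptions, not a published schema.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative data model of the altmetrics landscape described above;
# the names are our own choices, not a standardised schema.

@dataclass
class ResearchObject:
    identifier: str   # e.g., a DOI for a document, an ORCID for an agent
    kind: str         # "document" (article, dataset, slides, ...) or "agent"

@dataclass
class Event:
    research_object: ResearchObject
    source: str       # data source that recorded the event, e.g. "Mendeley"
    event_type: str   # e.g. "saved", "viewed", "tweeted"
    count: int = 1

@dataclass
class Aggregator:
    name: str
    events: List[Event] = field(default_factory=list)

    def track(self, event: Event) -> None:
        """Collect an event gathered from a data source."""
        self.events.append(event)

    def altmetrics_for(self, identifier: str) -> dict:
        """Aggregate event counts per (source, event type) for one research object."""
        metrics: dict = {}
        for e in self.events:
            if e.research_object.identifier == identifier:
                key = (e.source, e.event_type)
                metrics[key] = metrics.get(key, 0) + e.count
        return metrics

# Example: an article saved on Mendeley, tracked by a (hypothetical) aggregator.
article = ResearchObject("10.1000/example-doi", "document")
aggregator = Aggregator("ExampleAggregator")
aggregator.track(Event(article, "Mendeley", "saved", 12))
print(aggregator.altmetrics_for("10.1000/example-doi"))
```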
Altmetric aggregators
In the last few years, several aggregators have been created that act either as providers of altmetrics, as impact monitors, or as metric aggregators. In Fig. 1, an overview of the altmetrics aggregators is given, showing how they interact with one another. Some aggregators use data from other aggregators, thus becoming secondary or tertiary aggregators. The information compiled here is based on information gathered from the aggregators' websites or blogs (accessed as of 18 December, 2015) and is subject to change, as updates are made to their websites and blogs regularly. Of course, some hosts such as academic social networks like ResearchGate or Academia.edu, or some publishers, could also be considered aggregators or providers, as they increasingly track events (mostly usage metrics like views or downloads) on their platforms. These hosts however mainly collect locally generated events and presently do not aim to aggregate events from different sources. Thus, we consider them not to be aggregators but rather hosts and, in some cases, data sources.
(a) Altmetric.com4 (Adie and Roe 2013) was started in 2011 and provides the Altmetric Score, a quantitative measure of the attention that a scholarly article has received, derived from three major factors: volume, sources, and authors. A free bookmarklet is offered for researchers, while access to an API, embeddable badges and an explorer are offered for a fee. A web application called Altmetric for Scopus is provided for Scopus, and the PLOS Altmetric Impact Explorer allows browsing conversations collected for papers published by PLOS.
(b) Impactstory5 was earlier known in 2011 as Total Impact (Priem et al. 2012a). It is
an open-source, web-based tool that helps scientists explore and share the impacts of
their research outputs by supporting profile-based embedding of altmetrics in their
CV (curriculum vitae) (Piwowar and Priem 2013). Some of the altmetrics are reused
from Altmetric.com (social mentions) and PLOS ALM (HTML and PDF views) as
shown in Fig. 1.
(c) Plum Analytics6 (Buschman and Michalek 2013) tracks more than 20 different
types of artifacts and collects 5 major categories of impact metrics: usage, captures,
mentions, social media and citations. Metrics are captured and correlated at the
group level (e.g., lab, department, or journal). Plum Analytics compiles article level
usage data from various sources including PLOS ALM as shown in Fig. 1. Plum
Analytics, founded in 2011, offers several products including: PlumX ALM, Plum
Print, PlumX Artifact Widget and Open API.
(d) PLOS ALM7 (Lin and Fenner 2013b) was launched in 2009, and provides a set of metrics called Article-Level Metrics (ALM) that measure the performance and outreach of research articles published by PLOS. The ALM API and dataset are freely available for all PLOS articles. PLOS also offers an open-source application called Lagotto, which retrieves metrics from a wide set of data sources, like Twitter, Mendeley, and CrossRef.

[Fig. 1: The altmetrics landscape. Data sources such as Mendeley, Twitter, SlideShare, blogs, Reddit, F1000, Wikipedia, GitHub, Dryad, Figshare, and RePEc feed the aggregators PLOS ALM, Plum Analytics, Impactstory, Altmetric.com, Webometric Analyst, Snowball Metrics, and Kudos.]

4 http://www.altmetric.com. Accessed 18 Feb 2016.
5 https://impactstory.org. Accessed 18 Feb 2016.
6 http://www.plumanalytics.com. Accessed 18 Feb 2016.
7 http://article-level-metrics.plos.org. Accessed 18 Feb 2016.
(e) Kudos8 is a web-based service that supports researchers in gaining higher research
impact by increasing the visibility of their published work. Researchers can describe
their research, share it via social media channels, and monitor and measure the effect
of these activities. Kudos also offers services for institutions, publishers, and
funders. Kudos compiles citation counts from WoS and altmetrics from Altmetric.com (as shown in Fig. 1). In addition, share referrals and views on Kudos are also collected.
(f) Webometric Analyst9 (Thelwall 2012a) was formerly known as LexiURL
Searcher (Thelwall 2010). It uses URL citations or title mentions instead of
hyperlink searches for network diagrams, link impact reports, and web environment
networks. Webometric Analyst searches via Mendeley and Bing for metrics and
reuses altmetrics from Altmetric.com, see Fig. 1.
(g) Snowball Metrics10 (Colledge 2014) comprises a set of 24 metrics based on agreed
and tested methodologies. It aims to become a standard for institutional
benchmarking. The metrics are categorised into (i) research input metrics:
applications volume—the amount of research grant applications submitted to
external funding bodies and awards volume—the number of awards granted and
available to be spent; (ii) research process metrics: the volume of research income
spent and the total value of contract research; and (iii) research output metrics:
mainly scholarly output, field-weighted citation impact, collaboration impact, and
altmetrics. As shown in Fig. 1, Snowball Metrics reuses altmetrics from Altmet-
ric.com, Plum Analytics and Impactstory.
8 https://www.growkudos.com. Accessed 18 Feb 2016.
9 http://lexiurl.wlv.ac.uk. Accessed 18 Feb 2016.
10 http://www.snowballmetrics.com. Accessed 18 Feb 2016.
Some other aggregators are now offline. For example, ReaderMeter,11 created in 2010, estimated the impact of scientific content based on its consumption by readers on Mendeley. Another example is ScienceCard,12 which was based on PLOS ALM and offered article-level metrics. Other examples are Crowdometer, a crowdsourced service to classify tweets, and CitedIn (Priem et al. 2012a), which collected altmetrics for PubMed articles.
In Table 1, the various features offered by the aggregators mentioned above are presented.
Similar to Wouters and Costas (2012), these features comprise technical functionality like
the availability of an API, the availability of a visualisation of altmetrics data, user
interfaces such as widgets and bookmarklets, and search and filter options; quality features
such as gaming and spam detection, disambiguation, normalisation, data access and
management; as well as other services offered to target audiences and user groups, user
access to the systems, coverage of the metrics, the level of metrics offered, and whether
traditional metrics such as citations are included in order to directly compare impact
measures on the systems. When no information is found, we state N/A for information not
available. Most aggregators cover a wide range of data sources and offer some form of
visualisation of altmetrics, e.g., through widgets, bookmarklets and embedding. As part of
their quality assurance strategy, Altmetric.com uses only those data sources that can be
audited (Adie and Roe 2013). The blogs and news sources are manually curated. Altmetric.com is the only aggregator that offers an aggregated score, alongside the metrics, for the artifacts monitored. All the aggregators cover multiple disciplines and are transparent about what data sources they cover.
Data sources used by altmetric aggregators

In Table 2, the diverse data sources used by the altmetric aggregators are shown. These cover both social media data sources and bibliometric data sources, as altmetric aggregators report on bibliometrics as well as altmetrics, as shown in Table 1. The data sources in Table 2 were collected from the aggregators' websites and blogs. When no information is found, we state N/A for information not available. Inspired by Priem and Hemminger (2010), the data sources are classified into several categories: social bookmarking and reference managers; video, photo and slide sharing services; social networks; blogging; microblogging; recommendation and review systems; Q&A websites and forums; online encyclopaedia; online digital libraries, repositories and information systems; dataset repositories; source code repositories; online publishing services; search engines and blog aggregators; and other less common sources such as policy documents, news sources, specialised services and the Web in general. Many of these data sources are described by Kumar and Mishra (2015) and analysed by Wouters and Costas (2012).
Mendeley is the most popular social media data source, covered directly by all aggregators in the table apart from Snowball Metrics. Other popular data sources are Twitter, YouTube, Wikipedia, Scopus and PLOS. Most of the aggregators, however, have their own preferences of data sources and only a few are used in common. Data sources shown in brackets are no longer viable.

11 http://readermeter.org. Accessed 18 Feb 2016.
12 http://50.17.213.175. Accessed 18 Feb 2016.
Table 1 Features of altmetric aggregators

API: Altmetric.com: Altmetric API (RESTful); Impactstory: (deprecated); Plum Analytics: Open API; PLOS ALM: PLOS ALM API, PLOS Search API; Kudos: N/A; Webometric Analyst: Bing API for web searches; Snowball Metrics: N/A.

Visualisation: Altmetric.com: Altmetric Explorer, Altmetric Badges, Altmetric for Institutions; Impactstory: N/A; Plum Analytics: Plum Print, PlumX Dashboards, PlumX +Grants, PlumX Benchmarks, PlumX Funding Opportunities; PLOS ALM: N/A; Kudos: N/A; Webometric Analyst: network visualisation; Snowball Metrics: N/A.

Widget, embedding, bookmarklet: Altmetric.com: embeddable Altmetric badges, Altmetric Bookmarklet, badge-based embedding; Impactstory: Impactstory profile, open-access badges; Plum Analytics: embeddable PlumX Artifact Widget (custom widget builder), artifact pop-up widget, artifact summary widget, artifact details widget, profile widget, group widget, researcher widget, grant widget; PLOS ALM: PLOS ALM Widget plugin for WordPress; Kudos: Kudos Publication widget (embeds publication details), Kudos Resources widget; Webometric Analyst: N/A; Snowball Metrics: N/A.

Search/filter options: Altmetric.com: search and filter; Impactstory: N/A; Plum Analytics: search and filter; PLOS ALM: search and filter, filter for spam and duplicates; Kudos: search and filter; Webometric Analyst: web search, output filter; Snowball Metrics: publication filter.

Detection: Altmetric.com: gaming and spam detection; Impactstory: N/A; Plum Analytics: N/A; PLOS ALM: gaming detection; Kudos: N/A; Webometric Analyst: plagiarism and spam detection with automatic spam removal; Snowball Metrics: N/A.

Disambiguation: Altmetric.com: disambiguates links; Impactstory: N/A; Plum Analytics: disambiguates links and authors; PLOS ALM: disambiguates authors using ORCID; Kudos: disambiguates authors using ORCID; Webometric Analyst: N/A; Snowball Metrics: N/A.

Normalisation: Altmetric.com: data (not score) normalisation; Impactstory: N/A; Plum Analytics: N/A; PLOS ALM: N/A; Kudos: N/A; Webometric Analyst: N/A; Snowball Metrics: normalised for size.

Data access and management: Altmetric.com: data download and management; Impactstory: data download and management; Plum Analytics: data download and management; PLOS ALM: data download, management, standardisation and cleansing; Kudos: N/A; Webometric Analyst: data download and management; Snowball Metrics: data standardisation and cleansing.

Target audience: Altmetric.com: researchers, institutions, publishers, funding agencies; Impactstory: researchers; Plum Analytics: researchers, institutions, publishers, funding agencies; PLOS ALM: researchers, institutions, publishers, funding agencies; Kudos: researchers, institutions, publishers, funding agencies; Webometric Analyst: researchers; Snowball Metrics: institutions, funding agencies, metrics for journals.

User access: Altmetric.com: registration required for some products, few products and trial versions for free; Impactstory: open access, fees for profiles after trial, sign-up required; Plum Analytics: subscription based; PLOS ALM: free access; Kudos: registration required, free access for researchers, fees for publishers, funders and institutions; Webometric Analyst: free access, registration required; Snowball Metrics: free access.

Coverage of metrics: multi-disciplinary for all aggregators.

Level of metrics: Altmetric.com: article-level metrics, aggregated altmetric score; Impactstory: artifact-level metrics, researcher level; Plum Analytics: artifact-level metrics; PLOS ALM: article-level metrics; Kudos: article-level metrics; Webometric Analyst: artifact-level metrics; Snowball Metrics: artifact-level metrics.

Bibliometrics: Altmetric.com: N/A; Impactstory: citation counts; Plum Analytics: citation counts; PLOS ALM: citation counts; Kudos: WoS citation counts; Webometric Analyst: N/A; Snowball Metrics: citation counts.
Table 2 Data sources used by altmetric aggregators

Social bookmarking/reference managers: Altmetric.com: Mendeley, CiteULike, (Connotea), Delicious; Impactstory: Mendeley, CiteULike, Delicious; Plum Analytics: Mendeley, CiteULike, Delicious; PLOS ALM: Mendeley, CiteULike, (Connotea); Kudos: N/A; Webometric Analyst: Mendeley; Snowball Metrics: N/A.

Video, photo and slide sharing: Altmetric.com: YouTube, Vimeo, SlideShare, Pinterest; Impactstory: YouTube, Vimeo, SlideShare; Plum Analytics: YouTube, Vimeo, SlideShare, Podcasts; PLOS ALM: N/A; Kudos: N/A; Webometric Analyst: YouTube, Flickr, DailyMotion; Snowball Metrics: N/A.

Social networks: Altmetric.com: Facebook, Google+, (LinkedIn); Impactstory: Facebook; Plum Analytics: Facebook, Google+; PLOS ALM: Facebook; Kudos: N/A; Webometric Analyst: Facebook, Academia.edu, ResearchGate; Snowball Metrics: N/A.

Blogging: Altmetric.com: Nature blogs, Scientific American blogs, PLOS blogs, and others; Impactstory: N/A; Plum Analytics: Research Blogging, and others; PLOS ALM: Research Blogging, Google blogs, Nature, WordPress, and others; Kudos: N/A; Webometric Analyst: Google blogs; Snowball Metrics: N/A.

Microblogging: Altmetric.com: Twitter, Sina Weibo; Impactstory: Twitter; Plum Analytics: Twitter; PLOS ALM: Twitter; Kudos: N/A; Webometric Analyst: Twitter, Tumblr; Snowball Metrics: N/A.

Recommendation and review systems: Altmetric.com: F1000, Reddit, Publons, PubPeer; Impactstory: Publons; Plum Analytics: Reddit, Goodreads, Amazon reviews; PLOS ALM: F1000Prime, Reddit; Kudos: N/A; Webometric Analyst: N/A; Snowball Metrics: N/A.

Q&A and forums: Altmetric.com: Stack Exchange, diverse forums; Impactstory: N/A; Plum Analytics: Stack Exchange; PLOS ALM: N/A; Kudos: N/A; Webometric Analyst: N/A; Snowball Metrics: N/A.

Online encyclopaedia: Altmetric.com: Wikipedia; Impactstory: Wikipedia; Plum Analytics: Wikipedia; PLOS ALM: Wikipedia; Kudos: N/A; Webometric Analyst: Wikipedia; Snowball Metrics: N/A.

Online digital libraries/repositories/information systems: Altmetric.com: PMC, PubMed, Scopus, CrossRef, Figshare, arXiv; Impactstory: PMC, PubMed, Scopus, CrossRef, Figshare; Plum Analytics: WorldCat, institutional repositories, RePEc, EBSCO, SSRN, dSpace, EPrints, USPTO patents; PLOS ALM: PMC, PubMed, Scopus, CrossRef, Figshare, WoS, Europe PMC, BioMed Central; Kudos: WoS; Webometric Analyst: WorldCat, arXiv, EPrints; Snowball Metrics: Scopus, WoS, institutional repositories, dSpace, Lexis, CRIS.

Dataset repositories: Altmetric.com: N/A; Impactstory: Dryad; Plum Analytics: Dryad; PLOS ALM: DataCite, ADS; Kudos: N/A; Webometric Analyst: N/A; Snowball Metrics: N/A.

Source code repositories: Altmetric.com: N/A; Impactstory: GitHub; Plum Analytics: GitHub, SourceForge; PLOS ALM: GitHub, Bitbucket; Kudos: N/A; Webometric Analyst: N/A; Snowball Metrics: N/A.

Online publishers: Altmetric.com: diverse publishers; Impactstory: PLOS; Plum Analytics: PLOS; PLOS ALM: PLOS, OpenEdition, Copernicus; Kudos: about 50 publishers; Webometric Analyst: N/A; Snowball Metrics: PLOS.

Search engines, blog aggregators: Altmetric.com: N/A; Impactstory: ScienceSeeker; Plum Analytics: ScienceSeeker; PLOS ALM: Google Scholar, ScienceSeeker, Nature open search; Kudos: N/A; Webometric Analyst: Google Books, Bing; Snowball Metrics: Google Scholar.

Others: Altmetric.com: policy documents, research highlights in Nature journals, QS, global news outlets in several languages, science and general news, news articles, manual entries from radio and TV; Impactstory: the Web, ORCID; Plum Analytics: bit.ly, news articles, COUNTER, DataONE counter, DataONE usage; PLOS ALM: trackbacks, Technorati, curated article coverage; Kudos: N/A; Webometric Analyst: the Web, Google code, Google patents; Snowball Metrics: journal metrics, WIPO.
LinkedIn has closed its data stream and Connotea has discontinued its service as of March 2013. Historical data from LinkedIn and Connotea are, however, still available on Altmetric.com. Rarely considered data sources are,
for example, Flickr, Technorati, Sina Weibo (a Chinese microblog), CRIS (Current
Research Information System), WIPO (World Intellectual Property Organization), QS
(Quacquarelli Symonds—a global provider of specialist higher education and careers
information and solutions), SSRN (Social Science Research Network), ADS (Astrophysics
Data System), bit.ly (a URL shortening service), COUNTER (Counting Online Usage of
Networked Electronic Resources), policy documents, research highlights, news outlets, and
trackbacks.
Events tracked by altmetric aggregators

Altmetrics are based on events on social media, which are created as a result of actions on research objects. There are several classifications for events (Bornmann 2014b; Lin and
Fenner 2013a). For example, PLOS ALM has the categories: cited; discussed; viewed, and
saved, while Plum Analytics has the categories: usage; captures; mentions, and social
media citations. Haustein et al. (2016) propose a framework to describe the acts on
research objects. Acts are defined as activities that occur on social media leading to online
events. The three categories of acts have an increasing level of engagement from just
accessing, to appraising, to applying (Haustein et al. 2016).
Table 3 gives an overview of the events which the altmetrics aggregators pull from the
various social media data sources and aggregate to finally provide altmetrics to the end-
users. We collected these events primarily from the listings on the aggregators’ websites
and blogs. We also retrieved further details and events from their APIs and application
source codes (on GitHub). For some of the data sources, we did not find any explicit
information about which events are retrieved nor how these events are aggregated nor
counted. When no information is found, we state N/A for information not available. We
classify the events we found according to the framework from Haustein et al. (2016) and
inspired by PLOS ALM's and Plum Analytics' categories. The event categories proposed here are Usage Events and Capture Events for access acts, Mention Events and Social Events for appraisal acts, and Citation Events for apply acts. The access events encompass PLOS ALM's viewed and saved categories and the usage and captures categories from Plum Analytics. The appraise events map to PLOS ALM's discussed and mention categories and Plum Analytics' social media citations. Apply events correspond to PLOS ALM's cited category and Plum Analytics' citations.
– Access Acts and Events Access Acts are actions that involve accessing and showing interest in a research object. For scholarly documents, this could be viewing the metadata of an article such as its title or abstract, or downloading and storing it. For scholarly agents, access could involve viewing the researcher's homepage, downloading a CV, sending an email, befriending, or following the researcher (Haustein et al. 2016). Access Acts lead to usage events (e.g., views, reads, downloads, link outs, library holdings, clicks) and to capture events (e.g., saves, bookmarks, tags, favorites, or code forks). Data sources for Access Acts encompass diverse online platforms and repositories, reference managers, academic social networking platforms and communication media like email or messaging (Haustein et al. 2016). In Table 3, the most common usage events are views and downloads. Nearly all aggregators track Mendeley readers; Impactstory even tracks the percentage of readers by country, discipline and career stage.
Table 3 Events tracked by altmetric aggregators

Social bookmarking/reference managers: Usage events: Mendeley readers, Mendeley users, Mendeley groups, CiteULike readers, (Connotea readers); Capture events: Mendeley bookmarks, CiteULike bookmarks, Delicious bookmarks; Mention events: Delicious citations; Social events: N/A; Citation events: N/A.

Video, photo and slide sharing: Usage events: YouTube views, YouTube plays, Vimeo plays, SlideShare views, SlideShare downloads; Capture events: YouTube favorites, YouTube subscribers, Vimeo user subscribers, SlideShare favorites; Mention events: YouTube comments, Vimeo comments, Vimeo forum topic counts, SlideShare comments, YouTube video citations, Pinterest citations; Social events: YouTube likes, YouTube dislikes, Vimeo likes; Citation events: N/A.

Social networks: Usage events: Facebook clicks; Capture events: N/A; Mention events: Facebook comments, Facebook public posts, Altmetric Facebook public posts, Facebook wall citations, Facebook mentions, Altmetric Google+ posts, cited by Google+, cited by Google+ users count, Google+ public posts, (cited by LinkedIn users count), (LinkedIn public posts); Social events: Facebook likes, Facebook shares, Google +1s; Citation events: N/A.

Blogging: Mention events: blog posts, counts, and mentions, economic blog mentions, Nature blog discussions, Research Blogging discussions, WordPress discussions; all other event types: N/A.

Microblogging: Mention events: Twitter public comments, cited by tweeters count, Sina Weibo mentions; Social events: tweets and re-tweets; all other event types: N/A.

Recommendation and review systems: Usage events: Goodreads readers; Capture events: Publons forks; Mention events: PubMed articles reviewed by F1000, F1000 reviews, cited by Reddit users count, Reddit comments, original posts, and discussions, Goodreads reviews, Amazon reviews, cited by online peer review sites; Social events: F1000Prime scores, F1000Prime recommendations, Reddit scores, Reddit likes, Publons stars, Goodreads ratings, Amazon ratings; Citation events: N/A.

Q&A and forums: Mention events: Stack Exchange links, Stack Exchange citations, forum citations; all other event types: N/A.

Online encyclopaedia: Mention events: Wikipedia mentions, Wikipedia discussions, Wikipedia links; all other event types: N/A.

Online digital libraries/repositories/information systems: Usage events: PMC HTML views, PMC HTML/PDF/XML downloads, Figshare views (figures, HTML, tables), Figshare downloads (figures, tables), Figshare supporting information file usage stats, WorldCat holdings, institutional repository downloads, RePEc (abstract views, downloads, usage), EBSCO views (abstract, full-text, HTML, PDF, supporting data), EBSCO (sample downloads, link outs, exports, saves), EPrints views (abstract, PDF), SSRN downloads, dSpace views (abstract, PDF); Capture events: N/A; Mention events: PMC reviews; Social events: Figshare shares (figures, tables), Figshare recommendations, Figshare likes; Citation events: PMC cited by, PMC citations by editorials and reviews, Scopus citations, CrossRef citations, WoS citations, SSRN citations, USPTO citations, RePEc citations, Europe PMC citations, DB citations, BioMed Central citations.

Dataset repositories: Usage events: Dryad views, Dryad package views, Dryad downloads; Citation events: DataCite citations; all other event types: N/A.

Source code repositories: Usage events: GitHub downloads, GitHub collaborators; Capture events: GitHub forks, GitHub watchers, GitHub followers; Mention events: GitHub gists, SourceForge reviews; Social events: GitHub stars, SourceForge ratings, SourceForge recommendations; Citation events: N/A.

Online publishers: Usage events: PLOS views (abstract, full-text, HTML, PDF, supporting data, search page, figures), PLOS ALM views (HTML, PDF), PLOS downloads (PDF, XML), Copernicus views, publisher download counts, publication views and full-text downloads; Capture events: click-throughs to the publisher site; Mention events: PLOS comments, PLOS notes, PLOS mentions, OpenEdition discussions; Social events: PLOS star ratings; Citation events: N/A.

Search engines, blog aggregators: Mention events: ScienceSeeker discussions; Citation events: Google Scholar citations; all other event types: N/A.

Others: Usage events: bit.ly clicks, linkouts to articles from external websites, Kudos views; Capture events: ORCID saves, Kudos share referrals; Mention events: QS citations, research highlights citations, MSM citations, policy sources citations, journal comments; Social events: Altmetric score from Altmetric.com; Citation events: N/A.
– Appraisal Acts and Events Appraisal Acts involve mention events such as comments,
reviews, mentions, links or discussions in a post. Appraisal Acts also involve social
events such as likes, shares or tweets. Ratings or recommendations could either be
crowdsourced and quantitative like in Reddit, or qualitative (peer) judgements by
experts as on F1000 or PubPeer (Haustein et al. 2016). From Table 3, the most often
collected mention events are comments and reviews.
– Apply Acts and Events Apply Acts are actions that actively use significant parts of,
adapt or transform research objects as a foundation to create new work. These could be
the application or citation of theories, frameworks, methods, or results. Apply Acts
formulate something new by applying knowledge, experience and reputation, such as a
thorough discussion in a blog, slides adapted for a lecture, a modified piece of software,
a dataset used for an evaluation, or a prototype used for commercial purposes (Haustein et al. 2016). Apply Acts may also involve collaborating with others.
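As a worked illustration of this classification, the proposed event categories and their underlying acts can be expressed as a simple lookup from observed event types. This is our own sketch: the concrete event names and the mapping table are illustrative examples, not a complete or authoritative list.

```python
# Sketch: mapping observed event types to the act/event categories proposed
# above (access -> usage/capture, appraise -> mention/social, apply -> citation).
EVENT_CATEGORY = {
    "view": "usage", "download": "usage", "read": "usage",
    "save": "capture", "bookmark": "capture", "fork": "capture",
    "comment": "mention", "review": "mention", "blog_post": "mention",
    "like": "social", "share": "social", "tweet": "social",
    "citation": "citation",
}

ACT_FOR_CATEGORY = {
    "usage": "access", "capture": "access",
    "mention": "appraise", "social": "appraise",
    "citation": "apply",
}

def classify(event_type: str) -> tuple:
    """Return (event category, act) for an observed event type."""
    category = EVENT_CATEGORY.get(event_type, "unknown")
    return category, ACT_FOR_CATEGORY.get(category, "unknown")

print(classify("tweet"))      # ('social', 'appraise')
print(classify("download"))   # ('usage', 'access')
```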
The details about the data sources and events retrieved from the aggregators were
scattered across diverse websites and blogs, sometimes even with outdated and conflicting
reports. Therefore, we do not claim that the listings in Tables 1, 2, nor in Table 3 are in any
way exhaustive. When no information was found, we stated N/A (not available). It is also
not clear if F1000Prime is a data source or an aggregator as it does calculate a score based
on the ratings given by the F1000 faculty members.
Literature on altmetrics research

A systematic literature review of the altmetrics literature was conducted with the aim of answering the following questions:
1. How has literature on altmetrics grown over the years?
2. What research topics on altmetrics have been covered over the years?
3. Which social media data sources have been investigated over the years?
We applied a multi-staged sampling for the data collection for the literature review as
shown in Fig. 2. In the first stage, a search with the search term altmetric* was conducted
on 14 September, 2015 in Scopus to identify the venues having at least 6 articles in the
search results. A total of 13 venues were identified, namely: Scientometrics,13 the Journal
of Informetrics (JOI),14 the Journal of the Association for Information Science and
Technology (JASIST),15 PLOS ONE,16 Proceedings of the ASIS&T Annual Meeting,17
Insights: the UKSG journal,18 Aslib Journal of Information Management,19 PLOS
13 http://link.springer.com/journal/11192. Accessed 18 Feb 2016.
14 http://www.journals.elsevier.com/journal-of-informetrics. Accessed 18 Feb 2016.
15 http://onlinelibrary.wiley.com/journal/10.1002/(ISSN)2330-1643. Accessed 18 Feb 2016.
16 http://plosone.org. Accessed 18 Feb 2016.
17 https://www.asis.org/proceedings.html. Accessed 18 Feb 2016.
18 http://insights.uksg.org/. Accessed 18 Feb 2016.
19 http://www.emeraldgrouppublishing.com/products/journals/journals.htm?id=AJIM. Accessed 18 Feb 2016.

[Fig. 2: Multi-staged sampling for the literature review. First stage: a search in Scopus with the search term altmetric* identifies venues with at least 6 articles in the search results. Second stage: a full census is taken from each venue with the same search term altmetric*.]
Over the last years, several case studies have been performed with the aim of investigating altmetrics. Figure 4 gives an overview of the research topics published. In the following, representative examples are given for the different research topics investigated.
Diverse research objects have been investigated, ranging from scholarly agents like authors,
scholars, departments, institutions, and countries, to scholarly documents like articles,
reviews, conference papers, editorial materials, letters, notes, abstracts, books, software,
annotations, blogs, blog posts, YouTube videos, and acknowledgements. The main focus of
most of the research over the years has been on cross-metric validation (Bar-Ilan 2014;
Bornmann 2014a, c) and the majority of these were cross-disciplinary studies (Kousha and
Thelwall 2015c; Kraker et al. 2015; Thelwall and Fairclough 2015a; Zahedi et al. 2014a).
The most common method used for cross-metric validation was the calculation of correlations. Most results showed a weak to medium correlation between altmetrics and traditional bibliometrics (Haustein et al. 2014a; Zahedi et al. 2014a). Another focus was on studies investigating the validity of data sources (Haustein and Siebenlist 2011; Kousha and Thelwall 2015c; Shema et al. 2014; Thelwall and Maflahi 2015a), and on the coverage of altmetrics (investigating the number of research articles for which altmetrics were available) (Bornmann 2014c; Haustein et al. 2014a; Zahedi et al. 2014a).
20
http://journals.plos.org/plosbiology/. Accessed 18 Feb 2016.
21
http://www.issi2015.org/en/default.asp. Accessed 18 Feb 2016.
22
http://crln.acrl.org/. Accessed 18 Feb 2016.
23
http://recyt.fecyt.es/index.php/EPI/. Accessed 18 Feb 2016.
24
http://www.nature.com. Accessed 18 Feb 2016.
25
http://ceur-ws.org/. Accessed 18 Feb 2016.
123
[Fig. 3: Number of publications on altmetrics per venue over the years, until 14.09.2015 (Aslib Journal of Information Management, CEUR Workshop Proceedings, College and Research Libraries News, Insights, JASIST, Journal of Informetrics, Nature, PLOS Biology, PLOS ONE, Proceedings of ISSI, Scientometrics).]

[Fig. 4: Number of publications per research topic investigated in the altmetrics literature.]
Most studies concluded that Mendeley (Haustein et al. 2014a; Thelwall and Fairclough
2015a; Zahedi et al. 2014a) and Twitter (Bornmann 2014c; Hammarfelt 2014) have been
the most predominant data sources for altmetrics. There has been a growing interest in
investigating the limitations of altmetrics (Hammarfelt 2014; Zahedi et al. 2014a), as well
as the motivation of researchers using social media (Haustein et al. 2014a; Mas-Bleda
et al. 2014), and the investigation of normalisation methods (Bornmann 2014c; Bornmann
and Marx 2015; Thelwall and Fairclough 2015a). In recent years, attention has also been
given to investigating differences due to country biases (Mas-Bleda et al. 2014; Ortega
2015a; Thelwall and Maflahi 2015a), demographics (Ortega 2015a), gender (Bar-Ilan
2014; Hoffmann et al. 2015), disciplines (Holmberg and Thelwall 2014; Mas-Bleda et al.
2014; Ortega 2015a), and user group differences (Bar-Ilan 2014; Hoffmann et al. 2015;
Mas-Bleda et al. 2014; Ortega 2015a). Only a few articles looked at the visualisation of
altmetrics (Hoffmann et al. 2015; Kraker et al. 2015; Uren and Dadzie 2015) and detecting
gaming or spamming in recent years (Haustein et al. 2015a).
The social media altmetrics data sources investigated were: Mendeley (Zahedi et al. 2014a), CiteULike (Haustein and Siebenlist 2011), Connotea (Haustein and Siebenlist 2011; Yan and Gerstein 2011), BibSonomy (Haustein and Siebenlist 2011), Twitter (Zahedi et al. 2014a), F1000/F1000Prime (Bornmann 2014c; Bornmann and Marx 2015), ResearchGate (Hoffmann et al. 2015), Academia.edu (Mas-Bleda et al. 2014), LinkedIn (Mas-Bleda et al. 2014), Facebook (Hammarfelt 2014), YouTube and Podcasts, ResearchBlogging (Shema et al. 2014) and other blogs, Wikipedia (Zahedi et al. 2014a), Delicious (Zahedi et al. 2014a), LibraryThing (Hammarfelt 2014), SlideShare (Mas-Bleda et al. 2014), Amazon Metrics (Kousha and Thelwall 2015c), and WorldCat library holdings (Kousha and Thelwall 2015c). Figure 5 gives an overview of the social media data sources investigated in studies on altmetrics over the years.
In the last 2 years, there has been a large increase in the number of different social media data
sources considered as interesting for research studies on altmetrics. Recently, Mendeley is the
data source receiving the most interest. The number of studies on Mendeley has nearly doubled
since 2014 as can be seen in Fig. 5. Twitter has received a rather steady amount of interest over
the years, but it now seems the interest is shifting to Mendeley. F1000Prime seems to be
receiving just as much attention as Twitter in 2015 (Bornmann and Marx 2015). Facebook has
also received steady but low attention over the years (Hammarfelt 2014). In recent years, there
have been a few new data sources studied, such as LibraryThing (Hammarfelt 2014), Amazon
and WorldCat library holdings (Kousha and Thelwall 2015c). ResearchGate also seems to be
gaining interest (Hoffmann et al. 2015). Amongst the altmetrics aggregators, Altmetric.com
received the most interest (Bornmann 2014c; Hammarfelt 2014; Loach and Evans 2015;
Maleki 2015b; Peters et al. 2015), but also Impactstory (Maleki 2015b; Peters et al. 2015;
Zahedi et al. 2014a), PlumX (Peters et al. 2015), PLOS ALM (Maleki 2015b), and Webometric Analyst (Kousha and Thelwall 2015a) have been investigated.
The main research investigations on altmetrics over the last 5 years have been on the coverage of altmetrics and on cross-metric validation studies. The results of these two research topics are collated and analysed in the following sections.
[Fig. 5: Number of publications investigating each social media data source over the years (Mendeley, CiteULike, Connotea, BibSonomy, Delicious, SlideShare, LibraryThing, YouTube/Podcasts/Vimeo, ResearchGate, Academia.edu, LinkedIn, Google+, Facebook, Twitter, blogs, F1000Prime, Wikipedia).]
2014c), software mentions (Howison and Bullard 2015), data citations (Peters et al. 2015),
acknowledgements (Costas and Leeuwen 2012), syllabus mentions (Kousha and Thelwall
2015b), and mainstream media discussions (Haustein et al. 2015b).
In this analysis, the following research questions are investigated:
RQ1.1 What is the overall percentage coverage reported across studies for different data
sources?
RQ1.2 What is the overall mean event count reported across studies for different data
sources?
RQ1.3 What are the overall percentage non-zero coverage and mean non-zero event
count reported across studies for different data sources?
RQ1.4 What is the overall percentage coverage reported across studies for different
disciplines?
In the analysis, we do not consider the coverage of altmetric aggregators, nor do we
consider the coverage of usage metrics such as HTML views, nor downloads. We also do
not consider the coverage of sources of citations, such as WoS, Scopus, PubMed, or
CrossRef. Furthermore, some studies could not be considered in the analysis:
– Studies that do not report the actual sample size (e.g., Alperin 2015a; Bar-Ilan 2014; Haustein and Larivière 2014; Haustein et al. 2014a, b; Peters et al. 2012; Thelwall and Sud 2015).
– Studies that do not report the actual number of articles covered (e.g., Fairclough and Thelwall 2015; Hammarfelt 2013).
– Studies that report on coverage but whose results could not be compared to those from other studies. For example, Holmberg and Thelwall (2014) report on the coverage of tweets by researchers and not the coverage of publications. Bornmann (2014c) reports on unique tweeters mentioning articles rather than on the number of tweets.
– When coverage is reported as a breakdown by discipline (e.g., Costas et al. 2015; Haustein et al. 2015b; Kousha and Thelwall 2015c; Maleki 2015a; Mohammadi and Thelwall 2014; Mohammadi et al. 2015a), document type (e.g., Haustein et al. 2015b; Zahedi et al. 2014a), or gender (e.g., Paul-Hus et al. 2015), only the overall values for all disciplines, document types, or genders are included in the analysis to answer RQ1.1, RQ1.2, and RQ1.3.
In response to RQ1.1, Table 4 and Table 5 show the aggregated percentage coverage
across the 25 studies analysed for the 11 aforementioned altmetric data sources. A data
source refers to the social media data source investigated. The total sample size is the sum
of all articles that were considered by the individual studies for the calculation of the
coverage. The total number of articles covered is the number of articles available on the
social media platform that could potentially have altmetric events, some, however have
none. The overall percentage coverage is the percentage of articles covered with respect to
the total sample size. Table 5 shows non-zero coverage in answer to the first part of RQ1.3,
thus articles without events are not considered in these studies.
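The aggregation behind these figures is mechanical: sample sizes and covered-article counts are summed over the studies for one data source before dividing. The following is a minimal sketch of that computation with invented numbers, not values from the studies analysed here.

```python
# Sketch: aggregating percentage coverage across studies for one data source.
# Each tuple is (sample size, articles covered); the values are invented examples.
studies = [(10000, 6200), (5000, 2700), (20000, 11800)]

total_sample = sum(n for n, _ in studies)
total_covered = sum(c for _, c in studies)
overall_coverage = 100.0 * total_covered / total_sample

print(f"overall coverage: {overall_coverage:.1f} % "
      f"({total_covered} of {total_sample} articles)")
```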
In Table 4, Mendeley has the highest coverage of 59.2 % across 15 studies. Twitter also has a medium coverage of 24.3 % across 11 studies. Apart from CiteULike with a coverage of 10.6 % across 8 studies, all other data sources have a low coverage of below 8 %. Non-zero coverage shown in Table 5 gives slightly different results. F1000 has a coverage of 100 % in one study, while Mendeley has 40.6 % across 3 studies.
The coverage of altmetric events is shown in Tables 6 and 7, answering RQ1.2 and the
second part of RQ1.3. The total event count is the total number of altmetric events available for the sample. The average event count is the mean of the altmetric events for that sample. Different types of events are reported for individual data sources, such as: Mendeley bookmarks, readers, and readership; Twitter tweets; CiteULike bookmarks; blog mentions and posts; F1000 reviews; Facebook likes, shares, and mentions on walls or pages;
Wikipedia mentions; PubMed HTML views and downloads; and Reddit posts (excluding
comments). Table 6 shows that Mendeley has the highest mean event count of 6.92 across
9 studies, and Table 7 shows that Mendeley also has the highest non-zero mean event
count of 15.8 across 7 studies. All other data sources have a mean event count below 1 in
Table 6, and a mean non-zero event count below 4 in Table 7. The mean non-zero event
counts in Table 7 are all higher than the corresponding mean event counts in Table 6.
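The distinction between the two measures in Tables 6 and 7 reduces to whether zero-event articles enter the denominator, which also explains why non-zero means are always at least as large. A minimal sketch with invented counts:

```python
# Sketch: mean vs. non-zero mean event counts for one data source.
# event_counts holds the number of events per sampled article (invented values);
# zeros are sampled articles with no events on the platform.
event_counts = [0, 0, 3, 0, 12, 1, 0, 5]

mean_count = sum(event_counts) / len(event_counts)
non_zero = [c for c in event_counts if c > 0]
non_zero_mean = sum(non_zero) / len(non_zero)

print(f"mean event count: {mean_count:.2f}")     # averaged over all articles
print(f"non-zero mean:    {non_zero_mean:.2f}")  # averaged over covered articles only
```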
In response to RQ1.4, Table 8 gives the aggregated percentage coverage for different disciplines. The data sources Mendeley, Twitter, CiteULike, Blogs, Facebook, Google+, Wikipedia, and News have studies that report coverage values for different disciplines. Based on the most common categories used by the studies analysed, the disciplines are grouped into: Biomedical and health sciences, Life and earth sciences, Mathematics and computer science, Natural sciences and engineering, Social sciences and humanities, and Multidisciplinary.
Table 8 Aggregated percentage coverage for different disciplines

Mendeley: Biomedical and health sciences (Kousha and Thelwall 2015c; Maleki 2015a; Zahedi et al. 2014a; Mohammadi et al. 2015a): 4 studies, total sample size 173,095, total covered 122,845, coverage 71.0 %. Natural sciences and engineering (Kousha and Thelwall 2015c; Maleki 2015a; Zahedi et al. 2014a; Mohammadi et al. 2015a): 4 studies, total sample size 346,650, total covered 121,171, coverage 35.0 %. Social sciences and humanities (Hammarfelt 2013, 2014; Kousha and Thelwall 2015c; Maleki 2015a; Zahedi et al. 2014a; Mohammadi and Thelwall 2014; Mohammadi et al. 2015a): 7 studies, total sample size 107,655, total covered 53,942, coverage 50.1 %. Multidisciplinary (Zahedi et al. 2014a): 1 study, total sample size 216, total covered 172, coverage 79.6 %.

Twitter: Biomedical and health sciences (Andersen and Haustein 2015; Costas et al. 2015; Haustein et al. 2014b, 2015b; Zahedi et al. 2014a): 5 studies, total sample size 2,497,965, total covered 549,440, coverage 22.0 %. Life and earth sciences (Costas et al. 2015; Haustein et al. 2015b): 2 studies, total sample size 355,103, total covered 112,207, coverage 31.6 %. Mathematics and computer science (Costas et al. 2015; Haustein et al. 2015b): 2 studies, total sample size 187,175, total covered 23,147, coverage 12.4 %. Natural sciences and engineering (Costas et al. 2015; Hammarfelt 2014; Zahedi et al. 2014a): 3 studies, total sample size 600,776, total covered 90,636, coverage 15.1 %. Social sciences and humanities (Costas et al. 2015; Hammarfelt 2014; Haustein et al. 2015b; Zahedi et al. 2014a): 4 studies, total sample size 207,805, total covered 81,354, coverage 39.1 %. Multidisciplinary (Zahedi et al. 2014a): 1 study, total sample size 216, total covered 16, coverage 7.4 %.

CiteULike: Natural sciences and engineering (Haustein and Siebenlist 2011): 1 study, total sample size 165,801, total covered 8127, coverage 4.9 %. Social sciences and humanities (Hammarfelt 2014; Sotudeh et al. 2015): 2 studies, total sample size 83,392, total covered 5247, coverage 6.3 %.

Blogs: Biomedical and health sciences (Costas et al. 2015; Haustein et al. 2015b): 2 studies, total sample size 812,369, total covered 20,258, coverage 2.5 %. Life and earth sciences (Costas et al. 2015; Haustein et al. 2015b): 2 studies, total sample size 355,103, total covered 12,626, coverage 3.6 %. Mathematics and computer science (Costas et al. 2015; Haustein et al. 2015b): 2 studies, total sample size 187,175, total covered 1942, coverage 1.0 %. Natural sciences and engineering (Costas et al. 2015; Haustein et al. 2015b): 2 studies, total sample size 585,956, total covered 11,443, coverage 2.0 %. Social sciences and humanities (Costas et al. 2015; Hammarfelt 2014; Haustein et al. 2015b): 3 studies, total sample size 205,144, total covered 7241, coverage 3.5 %.

Facebook: Biomedical and health sciences (Costas et al. 2015; Haustein et al. 2015b): 2 studies, total sample size 812,369, total covered 60,456, coverage 7.4 %. Life and earth sciences (Costas et al. 2015; Haustein et al. 2015b): 2 studies, total sample size 355,103, total covered 19,157, coverage 5.4 %. Mathematics and computer science (Costas et al. 2015; Haustein et al. 2015b): 2 studies, total sample size 187,175, total covered 2873, coverage 1.5 %.
From the data collected in the literature review in ‘‘Literature on altmetrics research’’
section, 58 publications were identified as having conducted studies on cross-metric validation.
Further publications identified during the review were included in our meta-analysis. Thus, in total, 68 publications
were identified as having performed cross-metric validation of altmetrics. The studies
however applied diverse statistical methods such as regression (Winter 2015; Bornmann
and Leydesdorff 2015; Bornmann and Haunschild 2015; Bornmann 2015a, 2014c, 2015c),
ANOVA (Allen et al. 2013), Mann-Whitney tests (Shema et al. 2014), Kendall's Tau (Weller and Peters 2012), bivariate correlation (Tang et al. 2012), or Pearson correlation (Costas et al. 2015; Eysenbach 2012; Haustein and Siebenlist 2011; Ringelhan et al. 2015; Shuai et al. 2012; Sotudeh et al. 2015; Thelwall and Wilson 2015a; Waltman and Costas 2014; Zhou and Zhang 2015; Henning 2010). Some studies did not mention the method applied (Torres-Salinas and Milanés-Guisado 2014; Peters et al. 2015; Haustein and Larivière 2014; Bowman 2015; Costas and Leeuwen 2012). The majority of studies,
however, applied Spearman correlation. Therefore, in order to have a consistent method
across all studies in the meta-analysis (Hopkins 2004), only those studies applying
Spearman correlation were considered. Furthermore, some studies did not report the exact
correlation values (Thelwall and Fairclough 2015a; Thelwall and Maflahi 2015b; Thelwall
and Fairclough 2015b; Loach and Evans 2015; Jiang et al. 2013; Haustein and Siebenlist
2011), nor the specific sample size (Bar-Ilan 2014; Cabezas-Clavijo et al. 2013; Chen
et al. 2015; Fausto et al. 2012; Ortega 2015b; Thelwall and Sud 2015), thus these results
could not be considered in our meta-analysis.
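For readers who want to reproduce such a pooling, the sketch below shows one standard way of combining correlation coefficients across studies: Fisher z transform weighted by sample size. It illustrates the general technique only, not necessarily the exact procedure or software configuration used for the meta-analysis here, and the study values are invented.

```python
import math

# Sketch: pooling correlation coefficients across studies via the Fisher z
# transform, weighted by (n - 3). Values are invented examples.
studies = [(0.42, 1200), (0.31, 800), (0.55, 300)]  # (correlation, sample size)

num = 0.0
den = 0.0
for rho, n in studies:
    z = math.atanh(rho)   # Fisher z transform of the correlation
    w = n - 3             # inverse-variance weight for z
    num += w * z
    den += w

pooled_rho = math.tanh(num / den)  # back-transform the weighted mean z
print(f"pooled correlation: {pooled_rho:.3f}")
```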
The majority of studies in our sample investigated the correlation between altmetrics
and citations from WoS, Scopus and GSC. Thus, the focus of our meta-analysis was on studies comparing altmetrics with these three citation sources. Therefore, studies comparing altmetrics to other metrics like the JIF (Eyre-Walker and Stoletzki 2013; Li et al. 2012; Haustein et al. 2014b), the JCS (Zahedi et al. 2014a, 2015a), the Immediacy Factor (IF) and the Eigenfactor (EF) (Haustein et al. 2014b), BCI and Google Books citations (Kousha and Thelwall 2015c), CrossRef and PMC cites (Liu et al. 2013; Yan and Gerstein 2011), university rankings (Thelwall and Kousha 2015), and usage metrics like downloads (Schlögl et al. 2014; Liu et al. 2013; Thelwall and Kousha 2015) or page views (Liu et al. 2013; Thelwall and Kousha 2015), were not considered in our meta-analysis.
26 https://www.meta-analysis.com/. Accessed 27 November 2015.
In Table 10, Mendeley has the highest correlation value of 0.37 with citations, whereas Google+ and Delicious have the lowest correlation value with citations of 0.07. Mendeley also has the highest correlation value of 0.547 with citations on non-zero datasets. Facebook has the lowest correlation value of 0.109 with citations on non-zero datasets. With other altmetrics, CiteULike has the highest correlation value of 0.322 and Wikipedia the lowest correlation with other altmetrics of 0.053. Mendeley had the highest overall correlation value across citations and altmetrics of 0.335, while Delicious had the lowest overall correlation of 0.064. Further details of the results from the meta-analysis, answering RQ2.1, RQ2.2, and RQ2.3, are shown in Appendix 1.
In Table 11, the results of cross-metric validation studies are shown for several data
sources, comparing across different disciplines, thus answering RQ2.4. From the relevant
studies considered, four common disciplinary categories could be identified: Biomedical
and life sciences, Social sciences and humanities, Natural sciences and engineering, and
Multidisciplinary. Most of the studies reporting correlation values focused on Biomedical
and life sciences. For Mendeley and CiteULike, Biomedical and life sciences had the
highest correlation compared to most of the metrics. For Twitter, the highest correlation is
measured for Social sciences and humanities.
Discussion
Overall, the analysis of the results on the coverage of altmetrics shows a low coverage across all metrics and for all disciplines. The results of cross-metric validation studies also show, overall, a weak correlation to citations across all disciplines. These studies however faced numerous challenges and issues, as further discussed in the "Challenges and issues" section, ranging from challenges in data collection to issues in data integrity. We also faced some of these issues when compiling the results across the different studies. Confusing and sometimes contradictory terminology and the lack of standard definitions for altmetric events were the most challenging issues when trying to consolidate the results and perform an overall analysis across so many studies. Thus the results presented here need to be considered with some caution, due to the many discrepancies amongst the methodologies, datasets, definitions, and goals of the various studies considered.
Conclusion and outlook

Altmetrics is still in its infancy; however, we can already detect a growing importance of this emergent application area of social media for research evaluation. This paper gives a compact overview of the key aspects relating to altmetrics. The major aggregators were analysed according to the features they offer and the data sources they collect events from. A snapshot of the research literature on altmetrics shows a steady increase in the number of research studies and publications on altmetrics since 2011. In particular, the validity of altmetrics compared to traditional bibliometric citation counts was investigated. Furthermore, a detailed analysis of the coverage of altmetrics data sources was presented, as well as a meta-analysis of the results of cross-metric validation studies. Mendeley has the highest coverage of about 59 % across 15 studies and the highest correlation value when compared to citations of 0.37 (and 0.5 on non-zero datasets). Thus, overall, results from the literature review, coverage analysis and meta-analysis show that, presently, Mendeley is the most interesting and promising social media source for altmetrics, although the data sources are becoming more and more diverse.
Challenges and issues

Altmetrics is, however, still a controversial topic in academia, partly due to the challenges and issues it faces, some of which are as follows.
1. Data collection issues Altmetrics are usually collected via social media APIs, for
example, via Mendeley’s, Facebook’s, or Twitter’s API, or scraped from HTML
websites. There are however accessibility issues with certain APIs and restrictions to
the amount of data collectable per day. Thus data collection takes a long time, and
inconsistencies due to delays in data updates can arise (Zahedi et al. 2014b). Finding
the right search queries to use is also an issue as not all research objects (not even all
published research articles) have DOIs. DOIs are also not consistent across the
different registration agencies and tracking and resolving DOIs to URLs can have
complications such as accessibility issues, or difficulties with cookies (Zahedi et al.
2015b). Alternatively, the title or publication date of the publication might be used to
search. This is however dependent on the quality of the metadata from the different
bibliometric sources. These data collection issues are faced by the various altmetrics
aggregators and this results in inconsistencies with the metrics they provide (Zahedi
et al. 2015b).
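To illustrate how such API rate limits shape collection code, the following sketch throttles requests for a list of DOIs and records gaps instead of aborting on errors. The endpoint URL and the quota are hypothetical placeholders, not any real provider's API; real services (Mendeley, Twitter, etc.) have their own authentication schemes, endpoints and limits.

```python
import time
import urllib.request

API_URL = "https://api.example.org/metrics?doi={doi}"  # hypothetical endpoint
REQUESTS_PER_MINUTE = 30                               # assumed rate limit

def fetch_metrics(dois):
    """Fetch raw metric payloads for each DOI, staying under the rate limit."""
    delay = 60.0 / REQUESTS_PER_MINUTE
    results = {}
    for doi in dois:
        try:
            with urllib.request.urlopen(API_URL.format(doi=doi), timeout=10) as resp:
                results[doi] = resp.read()
        except OSError:              # network errors, HTTP errors, timeouts
            results[doi] = None      # record the gap rather than aborting
        time.sleep(delay)            # throttle to respect the daily/minute quota
    return results
```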
2. Data processing and disambiguation issues Altmetrics are based on the concept of tracking mentions of research outputs and resolving them to the underlying research objects. Resolving these links to unique identifiers can be very challenging. There might exist multiple versions of the same artifact across several sites, using different identifiers (Liu and Adie 2013).
There is also the issue of missing links, as some mentions do not include direct links to artifacts. A solution to this can be achieved by finding different ways to map the mentions to the articles by computing the semantics involved, also called Semantometrics (Knoth and Herrmannova 2014). Tracking multi-media data sources however still proves challenging, as most videos or podcasts do not include mentions of articles in their metadata, but rather verbally in the audio or video content (Liu and Adie 2013). From Table 3 in the "Events tracked by altmetric aggregators" section, we see
that Apply Events are rarely tracked. This might be due to the fact that apply acts are
hard to identify, for example, distinguishing between citations that are mentions and
those that discuss results is very complex (Bornmann 2015b). In addition, some
authors cannot be identified uniquely simply by using their names, and there could be
variations to author names that could make tracking more complex. Some of the
altmetrics aggregators, as shown in Table 1, provide features for disambiguation.
Altmetric.com applies text mining mechanisms to identify missing links to articles,
and disambiguates between different versions of the same output, collating all the
attention into one. PLOS ALM supports author disambiguation and identity resolution
by using ORCID (Open Researcher and Contributor ID) (Haak et al. 2012). Plum
Analytics disambiguates both links to articles and names of authors.
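As a simplified illustration of the link-resolution step, the sketch below extracts DOIs from the free text of a mention and normalises them so that different link forms collapse to one identifier. The regular expression is our own simplification covering only common DOI shapes; production systems like the ones described above use far more elaborate matching.

```python
import re

# Sketch: extracting and normalising DOIs from the text of a mention.
DOI_PATTERN = re.compile(r'\b(10\.\d{4,9}/[^\s"<>]+)', re.IGNORECASE)

def normalise_mentions(text: str):
    dois = set()
    for match in DOI_PATTERN.findall(text):
        doi = match.rstrip('.,;)')   # strip trailing punctuation
        dois.add(doi.lower())        # DOIs are case-insensitive
    return dois

post = ("Great paper: https://doi.org/10.1007/s11192-016-2077-0, "
        "also at http://dx.doi.org/10.1007/S11192-016-2077-0.")
print(normalise_mentions(post))      # both link forms collapse to one DOI
```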
3. No common definition of altmetric events and confusing terminology There are
many different ways by which altmetrics events can be measured from a data
source (Liu and Adie 2013). Table 3 shows the diverse range of altmetrics events
provided by altmetrics aggregators. One challenge is that there is no standard definition of
a specific altmetric event, thus aggregators name their events differently, for example,
the number of Mendeley readers of an article is often referred to as Mendeley
readership. In addition, event counts from a single data source could be measured in
different ways, and aggregators do not always explicitly state how the events are
counted: for example, whether, for a Facebook wall post, the likes and comments on the post are counted as well (Liu and Adie 2013), or whether re-tweets are counted in addition to original tweets.
This challenge is further compounded with confusing terminology such as the unclear
distinction between usage metrics and altmetrics (Glänzel and Gorraiz 2015).
4. Stability, coverage and usage of social media sources Social media data sources are
liable to change or discontinue their service (Bornmann 2014b). In ‘‘Data sources used
by altmetric aggregators’’ section, some discontinued social media data sources are
mentioned, and in ‘‘The altmetrics landscape’’ section some altmetrics aggregators are
also mentioned as no longer being in service. This fluctuation in the availability of
altmetrics poses a challenge, especially regarding reproducing the evidence for the
event counts. Furthermore, the usage and coverage of social media data sources
depends on various factors such as country, demographics and audiences (Bornmann
2014b; Priem et al. 2014). Some data sources are popular in certain countries, for
example, BibSonomy is popular in Germany (Peters et al. 2012).
5. Data integrity There are many concerns regarding gaming, spamming and plagiarism
in altmetrics. Several research studies have been conducted to investigate the
manipulation of research impact. One such study on automated Twitter accounts revealed that bot accounts generate a substantial share of the tweets to scientific articles, and that their tweeting criteria are usually random rather than quality-based (Haustein et al. 2015a). In Table 1, we present an overview of the various features
offered by the altmetrics aggregators. Some of them offer novel tools and features that
can help detect suspicious activity. Plum Analytics, Impactstory and PLOS ALM
gather citation metrics as part of their data, which helps users to compare traditional
metrics with altmetrics to see for themselves if there is any correlation between the
two. Altmetric.com, in addition to detecting gaming, also picks up on spam accounts
and excludes them from the final altmetric score. As part of their data integrity
process, PLOS ALM generates alerts from Lagotto in order to determine what may be
going wrong with the application, data sources, and data requests. These alerts are
used to discover potential gaming activities and system operation errors. Webometric
Analyst checks actively for plagiarism and supports automated spam removal by
excluding URLs from suspicious websites. SSRN and PLOS ALM have set up
strategies to ensure data integrity (Gordon et al. 2015). One such system is
DataTrust (Lin 2012), developed by PLOS ALM, which keeps track of suspicious
metrics activity. PLOS also analyses user behaviour and cross validates usage metrics
with other sources in order to detect irregular usage (Gordon et al. 2015). SSRN issues
warnings when fraudulent automatic downloads are detected (Edelman et al. 2009).
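To make the data collection and disambiguation issues above concrete, the following minimal Python sketch gathers attention data for a list of DOIs. It assumes the public Altmetric.com v1 per-DOI endpoint; the normalise_doi helper, the fixed one-request-per-second delay and the handling of missing DOIs are illustrative choices for this sketch, not a prescription from any aggregator.

import time
import requests

# Public Altmetric.com endpoint for per-DOI attention data (assumed here).
ALTMETRIC_API = "https://api.altmetric.com/v1/doi/{doi}"

def normalise_doi(raw):
    # Strip common resolver prefixes and lowercase, so the same artifact
    # is queried consistently regardless of how the DOI was written.
    doi = raw.strip().lower()
    for prefix in ("https://doi.org/", "http://doi.org/", "doi:"):
        if doi.startswith(prefix):
            doi = doi[len(prefix):]
    return doi

def fetch_altmetrics(dois, delay=1.0):
    # Query one DOI per `delay` seconds as naive rate limiting; DOIs the
    # service does not know (HTTP 404) are recorded as None.
    results = {}
    for raw in dois:
        doi = normalise_doi(raw)
        resp = requests.get(ALTMETRIC_API.format(doi=doi), timeout=30)
        results[doi] = resp.json() if resp.status_code == 200 else None
        time.sleep(delay)
    return results

scores = fetch_altmetrics(["https://doi.org/10.1007/s11192-016-2077-0"])

Even this toy collector reflects the issues above: identifiers must be normalised before querying, rate limits dominate the running time for large samples, and research objects without DOIs remain invisible to it.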
These issues listed above underline the need for common standards and best practices,
especially across altmetrics aggregators (Zahedi et al. 2015b). To this end, NISO (2014)
has started an initiative to formulate standards, propose best practices and develop
guidelines and recommendations for using altmetrics to assess research impact. Topics
include defining a common terminology for altmetrics, developing strategies to ensure data
quality, and the promotion and facilitation of the use of persistent identifiers. Ensuring
consistency and normalisation of altmetrics will also be an important future research
topic (Wouters and Costas 2012), as well as defining a common terminology, theories and
classification of altmetric events (Haustein et al. 2016; Lin and Fenner 2013a).
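One simple way to make event counts comparable across fields and years, in the spirit of the normalisation called for above, is percentile ranking within a reference set. The sketch below is a generic illustration with invented counts, not a method proposed by the cited authors.

from bisect import bisect_right

def percentile_rank(count, reference_counts):
    # Percentage of a reference set (papers of the same field and year)
    # with a value less than or equal to `count`.
    ref = sorted(reference_counts)
    return 100.0 * bisect_right(ref, count) / len(ref)

# Invented Mendeley reader counts for papers of one field and year.
field_2015 = [0, 0, 1, 2, 2, 3, 5, 8, 13, 40]
print(percentile_rank(5, field_2015))  # 70.0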
Altmetrics offer a unique opportunity to analyse the reach of scholarly output in
society (Taylor 2013). In future, network analysis of altmetrics will be needed to study
research interaction and communication (Davis et al. 2015; Priem et al. 2014). Altmetrics
can be used to describe research collaboration amongst scholars, scientists, and authors.
Acknowledgments This research is supported by the National Research Foundation, Prime Minister’s
Office, Singapore under its Science of Research, Innovation and Enterprise programme (SRIE Award No.
NRF2014-NRF-SRIE001-019).
Table 12 Cross-metric validation studies comparing altmetrics to citations from CrossRef, PubMed, journal based citations, book citations, and university rankings (studies listed per data source)
ResearchGate: Holmberg (2015), Thelwall and Kousha (2015)
Facebook: Costas et al. (2015), Loach and Evans (2015), Ringelhan et al. (2015), Holmberg (2015)
Google+: Costas et al. (2015)
LinkedIn: Holmberg (2015)
F1000: Bornmann (2015a), Bornmann and Leydesdorff (2015), Eyre-Walker and Stoletzki (2013), Waltman and Costas (2014), Holmberg (2015)
Wikipedia: Zahedi et al. (2014a), Tang et al. (2012)
News outlets: Costas et al. (2015), Loach and Evans (2015), Tang et al. (2012)
Amazon metrics: Kousha and Thelwall (2015a, c)
WorldCat holdings: Kousha and Thelwall (2015a, c)
Altmetric.com score: Costas et al. (2015)
Table 13 (continued)
Blogs: Yan and Gerstein (2011)
WorldCat holdings: Kousha and Thelwall (2015c)
Table 14 Cross-metric validation studies comparing Mendeley, CiteULike, Delicious, YouTube and F1000 to other altmetrics
Table 15 Cross-metric validation studies comparing Twitter, Blogs, Facebook, Google+, and ResearchGate to other altmetrics
In the following sections, the detailed results of the meta-analysis, answering research questions RQ2.1, RQ2.2, and RQ2.3 presented in ''Results of studies on cross-metric validation'', are shown. Table 16 gives an overview of the studies covered in the meta-analysis. The results are depicted as forest plots. In a forest plot, the studies considered in each meta-analysis are listed by the study names given in Table 16, with an additional index appended to the name if several results were reported in a single study. For each study, the correlation value, its lower and upper confidence limits, the Z-value and the p-value are given, as well as the sample size, i.e., the total number of data points (research artifacts) considered in the study. For each study, the measured correlation is represented by a black square whose size depicts the study's weight (according to sample size) in the meta-analysis.
The horizontal lines show confidence intervals. The overall measured correlation from the
meta-analysis is shown as a diamond, whose width depicts the confidence interval. When
the studies are grouped, several diamonds are shown, each representing the overall measured correlation across the group.
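As an illustration of how such pooled correlations can be computed, the sketch below combines study-level correlations via Fisher's z transformation, weighting each study by its sample size through the inverse-variance weight n - 3. It is a minimal fixed-effect version with made-up input values; the grouped analyses reported below may rest on different model assumptions.

import math

def pooled_correlation(studies):
    # studies: list of (r, n) pairs, one per study result.
    # Fisher's z transform makes correlations approximately normal,
    # with variance 1 / (n - 3), hence weight w = n - 3.
    weights = [n - 3 for _, n in studies]
    zs = [math.atanh(r) for r, _ in studies]
    z_bar = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    se = 1.0 / math.sqrt(sum(weights))
    # Back-transform the pooled z and its 95% interval to the r scale.
    lo, hi = math.tanh(z_bar - 1.96 * se), math.tanh(z_bar + 1.96 * se)
    return math.tanh(z_bar), lo, hi

# Hypothetical example: three study results (r, sample size).
r, lo, hi = pooled_correlation([(0.63, 3020), (0.60, 5000), (0.42, 10000)])
print(f"pooled r = {r:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")

In a forest plot, each study's square would sit at its own r with its own interval, and the diamond would sit at the pooled r, spanning the pooled confidence interval.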
Mendeley
Figure 6 shows the results of the meta-analysis for Mendeley compared to citations,
resulting in an overall correlation of 0.37, thus answering RQ2.1: 0.631 with Google
Scholar (Li et al. 2011, 2012; Bar-Ilan 2012), 0.577 with Scopus (Schlögl et al. 2014;
Haustein et al. 2014a; Thelwall and Sud 2015; Maflahi and Thelwall 2015; Li et al. 2012;
Bar-Ilan 2012; Bar-Ilan et al. 2012), and 0.336 with WoS (Zahedi et al. 2014a; Mohammadi and Thelwall 2014; Mohammadi et al. 2015a; Li et al. 2011; Maleki 2015b; Zahedi et al. 2015a; Li et al. 2012; Priem et al. 2012b; Bar-Ilan 2012). However, in response to RQ2.2, the overall correlation with citations on non-zero datasets was 0.547: 0.65 with Scopus (Thelwall and Wilson 2015b) and 0.543 with WoS (Mohammadi and Thelwall 2014; Mohammadi et al. 2015a; Sud and Thelwall 2015).
Fig. 6 Results of meta-analysis for Mendeley compared to citations (columns: compared metric, pooled correlation, lower and upper 95% CI limits, Z-value, p-value, total N):
Google Scholar citations 0.631 0.557 0.695 12.776 0.000 3020
Scopus citations 0.595 0.561 0.628 26.031 0.000 632003
WoS citations 0.416 0.398 0.434 39.110 0.000 3592724
Fig. 7 Results of meta-analysis for Mendeley compared to other altmetrics (columns: compared metric, study name, correlation, lower and upper 95% CI limits, Z-value, p-value, total N; rows without a study name are pooled estimates per metric):
Amazon Metrics KoushaThelwall2015Amazon4 0.062 -0.009 0.133 1.707 0.088 759
Amazon Metrics KoushaThelwall2015Amazon5 0.047 -0.096 0.188 0.643 0.520 190
Amazon Metrics KoushaThelwall2015Amazon6 -0.096 -0.166 -0.025 -2.648 0.008 759
Amazon Metrics KoushaThelwall2015Amazon39 0.003 -0.052 0.058 0.106 0.915 1262
Amazon Metrics KoushaThelwall2015Amazon40 0.004 -0.095 0.103 0.079 0.937 391
Amazon Metrics KoushaThelwall2015Amazon41 -0.033 -0.088 0.022 -1.171 0.241 1262
Amazon Metrics KoushaThelwall2015Amazon74 0.101 0.028 0.173 2.710 0.007 718
Amazon Metrics KoushaThelwall2015Amazon75 0.106 -0.030 0.239 1.523 0.128 208
Amazon Metrics KoushaThelwall2015Amazon76 -0.041 -0.114 0.032 -1.097 0.273 718
Amazon Metrics KoushaThelwall2015Amazon153 0.067 0.013 0.121 2.421 0.015 1305
Amazon Metrics 0.016 -0.025 0.058 0.782 0.434 7572
CiteULike Li2011Validating9 0.586 0.538 0.630 18.875 0.000 793
CiteULike Li2011Validating11 0.605 0.560 0.647 20.037 0.000 820
CiteULike LiThelwall2012F1000_6 0.586 0.550 0.619 25.073 0.000 1397
CiteULike Bar-Ilan2012BeyondCitations2 0.441 0.393 0.487 15.937 0.000 1136
CiteULike 0.557 0.480 0.626 11.630 0.000 4146
Delicious Zahedi2014HowWell4 0.031 0.017 0.045 4.360 0.000 19772
Delicious 0.031 0.017 0.045 4.360 0.000 19772
F1000 LiThelwall2012F1000_5 0.309 0.261 0.356 11.927 0.000 1397
F1000 0.309 0.261 0.356 11.927 0.000 1397
Twitter Zahedi2014HowWell5 0.070 0.056 0.084 9.858 0.000 19772
Twitter 0.070 0.056 0.084 9.858 0.000 19772
Wikipedia Zahedi2014HowWell3 0.083 0.069 0.097 11.697 0.000 19772
Wikipedia 0.083 0.069 0.097 11.697 0.000 19772
In response to RQ2.3, Fig. 7 shows the results of the meta-analysis for Mendeley
compared to other altmetrics. The overall correlation was 0.18: 0.016 with Amazon
Metrics (Kousha and Thelwall 2015c), 0.557 with CiteULike (Li et al. 2011, 2012; Bar-
Ilan et al. 2012), 0.031 with Delicious (Zahedi et al. 2014a), 0.309 with F1000 (Li et al.
2012), 0.070 with Twitter (Zahedi et al. 2014a), and 0.083 with Wikipedia (Zahedi et al.
2014a). The overall correlation with citations and altmetrics was 0.335.
Twitter

Answering RQ2.1, Fig. 8 shows the results of the meta-analysis for Twitter compared to WoS citations, resulting in an overall correlation of 0.108 (Haustein et al. 2015b, 2014b; Zahedi et al. 2014a; Priem et al. 2012b; Maleki 2015b). However, the overall correlation with citations on non-zero datasets (in response to RQ2.2) was 0.156: 0.392 with Google Scholar (Eysenbach 2012), 0.229 with Scopus (Eysenbach 2012), and 0.078 with WoS (Thelwall et al. 2013; Haustein et al. 2015b, 2014b).
Fig. 8 Results of meta-analysis for Twitter compared to WoS citations (columns: study name, correlation, lower and upper 95% CI limits, Z-value, p-value, total N):
Haustein2014TweetingBiomedicine 0.115 0.091 0.138 9.520 0.000 1119167
Haustein2015Characterizing 0.194 0.192 0.196 227.392 0.000 1339279
Maleki2015PubMed 0.072 0.049 0.094 6.265 0.000 7752
Priem2012Altmetrics 0.003 -0.022 0.028 0.221 0.825 6232
Zahedi2014HowWell 0.250 0.237 0.263 35.912 0.000 19772
Fig. 9 Results of meta-analysis for Twitter compared to other altmetrics (columns: compared metric, study name, correlation, lower and upper 95% CI limits, Z-value, p-value, total N; rows without a study name are pooled estimates per metric):
Blogs Haustein2015Characterizing2 0.194 0.192 0.196 227.392 0.000 1339279
Blogs 0.194 0.192 0.196 227.392 0.000
Delicious Zahedi2014HowWell12 0.125 0.111 0.139 17.668 0.000 19772
Delicious 0.125 0.111 0.139 17.668 0.000
Facebook Haustein2015Characterizing7 0.320 0.318 0.322 383.806 0.000 1339279
Facebook 0.320 0.318 0.322 383.806 0.000
Google+ Haustein2015Characterizing8 0.142 0.140 0.144 165.451 0.000 1339279
Google+ 0.142 0.140 0.144 165.451 0.000
Mendeley Zahedi2014HowWell5 0.070 0.056 0.084 9.858 0.000 19772
Mendeley 0.070 0.056 0.084 9.858 0.000
News Haustein2015Characterizing9 0.137 0.135 0.139 159.549 0.000 1339279
News 0.137 0.135 0.139 159.549 0.000
Wikipedia Zahedi2014HowWell9 0.056 0.042 0.070 7.882 0.000 19772
Wikipedia 0.056 0.042 0.070 7.882 0.000
In answer to RQ2.3, the overall correlation with other altmetrics resulted in 0.151 as
shown in Fig. 9: 0.194 with Blogs (Haustein et al. 2015b), 0.125 with Delicious (Zahedi
et al. 2014a), 0.32 with Facebook (Haustein et al. 2015b), 0.142 with Google+ (Haustein
et al. 2015b), 0.07 with Mendeley (Zahedi et al. 2014a), 0.137 with News (Haustein et al.
2015b), and 0.056 with Wikipedia (Zahedi et al. 2014a). Finally, the overall correlation
with citations and altmetrics was 0.111.
CiteULike
Figure 10 shows the results of the meta-analysis for CiteULike compared to citations,
resulting in an overall correlation of 0.288: 0.383 with Google Scholar (Li et al.
2012, 2011), 0.257 with Scopus (Li et al. 2012; Liu et al. 2013; Haustein et al. 2015b; Bar-
Ilan et al. 2012), and 0.256 with WoS (Priem et al. 2012b; Li et al. 2012, 2011) in answer
to RQ2.1. As shown in Fig. 11 and in answer to RQ2.3, the overall correlation with other
altmetrics resulted in 0.322: 0.076 with Blogs (Liu et al. 2013), 0.194 with Connotea (Liu
et al. 2013), 0.127 with F1000 (Li et al. 2012), and 0.557 with Mendeley (Li et al. 2011).
Finally, the overall correlation was 0.302 across citations and altmetrics.
Fig. 10 Results of meta-analysis for CiteULike compared to citations (columns: compared metric, study name, correlation, lower and upper 95% CI limits, Z-value, p-value, total N; rows without a study name are pooled estimates per metric):
Google Scholar citations Li2011Validating6 0.396 0.336 0.453 11.774 0.000 793
Google Scholar citations Li2011Validating8 0.381 0.321 0.438 11.468 0.000 820
Google Scholar citations LiThelwall2012F1000_9 0.377 0.331 0.421 14.806 0.000 1397
Google Scholar citations 0.383 0.352 0.413 22.116 0.000
Scopus citations Haustein2014Coverage2 0.230 0.174 0.284 7.883 0.000 1136
Scopus citations LiThelwall2012F1000_8 0.346 0.299 0.391 13.474 0.000 1397
Scopus citations Bar-Ilan2012BeyondCitations3 0.232 0.176 0.286 7.954 0.000 1136
Scopus citations Liu2013Correlation46 0.222 0.212 0.232 41.089 0.000 33128
Scopus citations 0.257 0.200 0.312 8.604 0.000
WoS citations Li2011Validating5 0.366 0.304 0.425 10.787 0.000 793
WoS citations Li2011Validating7 0.304 0.241 0.365 8.973 0.000 820
WoS citations LiThelwall2012F1000_10 0.345 0.298 0.390 13.432 0.000 1397
WoS citations Priem2012Altmetrics4 0.100 0.074 0.126 7.504 0.000 5596
WoS citations Priem2012Altmetrics5 0.200 0.110 0.286 4.329 0.000 459
WoS citations Priem2012Altmetrics6 0.200 0.054 0.338 2.674 0.007 177
WoS citations 0.256 0.136 0.368 4.118 0.000
Fig. 11 Results of meta-analysis for CiteULike compared to other altmetrics (columns: compared metric, study name, correlation, lower and upper 95% CI limits, Z-value, p-value, total N; rows without a study name are pooled estimates per metric):
Blogs Liu2013Correlation116 0.086 0.075 0.097 15.691 0.000 33128
Blogs Liu2013Correlation127 0.014 0.003 0.025 2.548 0.011 33128
Blogs Liu2013Correlation137 0.128 0.117 0.139 23.425 0.000 33128
Blogs 0.076 0.011 0.141 2.279 0.023
Connotea Liu2013Correlation155 0.194 0.184 0.204 35.762 0.000 33128
Connotea 0.194 0.184 0.204 35.762 0.000
F1000 LiThelwall2012F1000_12 0.127 0.075 0.178 4.767 0.000 1397
F1000 0.127 0.075 0.178 4.767 0.000
Mendeley Li2011Validating9 0.586 0.538 0.630 18.875 0.000 793
Mendeley Li2011Validating11 0.605 0.560 0.647 20.037 0.000 820
Mendeley LiThelwall2012F1000_6 0.586 0.550 0.619 25.073 0.000 1397
Mendeley Bar-Ilan2012BeyondCitations2 0.441 0.393 0.487 15.937 0.000 1136
Mendeley 0.557 0.480 0.626 11.630 0.000
Blogs
Figure 12 shows the results of the meta-analysis for Blogs. The correlation with WoS
citations (Haustein et al. 2015b; Priem et al. 2012b) was 0.117 in answer to RQ2.1.
However, the overall correlation with WoS citations on non-zero datasets (answering RQ2.2) was 0.194 (Thelwall et al. 2013; Haustein et al. 2015b).
In answer to RQ2.3, the correlation with other altmetrics was 0.14: 0.076 with CiteULike (Liu et al. 2013), 0.031 with Connotea (Liu et al. 2013), 0.18 with Facebook (Haustein et al. 2015b), 0.196 with Google+ (Haustein et al. 2015b), 0.279 with News (Haustein et al. 2015b), and 0.194 with Twitter (Haustein et al. 2015b). Overall across citations and
altmetrics the correlation was 0.135.

Fig. 12 Results of meta-analysis for Blogs compared to WoS citations and other altmetrics (columns: compared metric, study name, correlation, lower and upper 95% CI limits, Z-value, p-value, total N; rows without a study name are pooled estimates per metric):
CiteULike Liu2013Correlation116 0.086 0.075 0.097 15.691 0.000 33128
CiteULike Liu2013Correlation127 0.014 0.003 0.025 2.548 0.011 33128
CiteULike Liu2013Correlation137 0.128 0.117 0.139 23.425 0.000 33128
CiteULike 0.076 0.011 0.141 2.279 0.023
Connotea Liu2013Correlation138 0.031 0.020 0.042 5.644 0.000 33128
Connotea 0.031 0.020 0.042 5.644 0.000
Facebook Haustein2015Characterizing3 0.180 0.178 0.182 210.603 0.000 1339279
Facebook 0.180 0.178 0.182 210.603 0.000
Google+ Haustein2015Characterizing4 0.196 0.194 0.198 229.799 0.000 1339279
Google+ 0.196 0.194 0.198 229.799 0.000
News Haustein2015Characterizing5 0.279 0.277 0.281 331.671 0.000 1339279
News 0.279 0.277 0.281 331.671 0.000
Twitter Haustein2015Characterizing2 0.194 0.192 0.196 227.392 0.000 1339279
Twitter 0.194 0.192 0.196 227.392 0.000
WoS citations Haustein2015Characterizing1 0.124 0.122 0.126 144.244 0.000 1339279
WoS citations Priem2012Altmetrics19 0.100 0.074 0.126 7.504 0.000 5596
WoS citations Priem2012Altmetrics20 0.100 0.009 0.190 2.143 0.032 459
WoS citations Priem2012Altmetrics21 0.200 0.054 0.338 2.674 0.007 177
WoS citations 0.117 0.099 0.136 12.618 0.000
F1000
Figure 13 shows the results of the meta-analysis for F1000 compared to citations (in response to RQ2.1), resulting in an overall value of 0.229: 0.18 with Google Scholar (Li et al. 2012; Eyre-Walker and Stoletzki 2013), 0.278 with Scopus (Li et al. 2012; Mohammadi and Thelwall 2013), and 0.25 with WoS (Priem et al. 2012b; Li et al. 2012; Bornmann and Marx 2015; Bornmann and Leydesdorff 2013).30 As shown in Fig. 14 and in answer to RQ2.3, the overall correlation with altmetrics resulted in 0.22: 0.127 with CiteULike, and 0.309 with Mendeley (Li et al. 2012). Finally, the overall correlation was 0.219 across citations and altmetrics.
30 The correlation value 0.43 considered for Bornmann and Leydesdorff (2013) was not explicitly mentioned in the study, but was available in the meta-analysis by Bornmann (2015a).

Fig. 13 Results of meta-analysis for F1000 compared to citations (columns: compared metric, study name, correlation, lower and upper 95% CI limits, Z-value, p-value, total N; rows without a study name are pooled estimates per metric):
Google Scholar citations Eyre-Walker2013Assessment1 0.280 0.256 0.304 21.924 0.000 5811
Google Scholar citations Eyre-Walker2013Assessment2 0.370 0.305 0.432 10.372 0.000 716
Google Scholar citations Eyre-Walker2013Assessment7 0.110 -0.028 0.244 1.562 0.118 203
Google Scholar citations Eyre-Walker2013Assessment8 0.230 0.038 0.405 2.342 0.019 103
Google Scholar citations Eyre-Walker2013Assessment9 -0.089 -0.281 0.109 -0.879 0.379 100
Google Scholar citations Eyre-Walker2013Assessment10 0.150 0.018 0.277 2.221 0.026 219
Google Scholar citations Eyre-Walker2013Assessment11 0.220 0.028 0.397 2.237 0.025 103
Google Scholar citations Eyre-Walker2013Assessment12 -0.057 -0.225 0.114 -0.651 0.515 133
Google Scholar citations Eyre-Walker2013Assessment13 0.043 -0.133 0.216 0.477 0.633 126
Google Scholar citations Eyre-Walker2013Assessment14 0.150 -0.029 0.320 1.642 0.101 121
Google Scholar citations Eyre-Walker2013Assessment15 0.200 0.101 0.295 3.910 0.000 375
Google Scholar citations Eyre-Walker2013Assessment16 0.130 -0.054 0.305 1.390 0.165 116
Google Scholar citations Eyre-Walker2013Assessment17 0.093 0.008 0.177 2.143 0.032 531
Google Scholar citations Eyre-Walker2013Assessment18 0.150 0.047 0.250 2.836 0.005 355
Google Scholar citations LiThelwall2012F1000_15 0.290 0.241 0.337 11.147 0.000 1397
Google Scholar citations 0.171 0.112 0.229 5.603 0.000
Scopus citations Mohammadi2013AssessingF10001 0.383 0.289 0.470 7.452 0.000 344
Scopus citations Mohammadi2013AssessingF10002 0.300 0.221 0.375 7.126 0.000 533
Scopus citations Mohammadi2013AssessingF10005 0.201 0.137 0.264 6.024 0.000 877
Scopus citations LiThelwall2012F1000_14 0.293 0.244 0.340 11.270 0.000 1397
Scopus citations 0.289 0.222 0.353 8.100 0.000
WoS citations Bornmann2015Methods1 0.300 0.292 0.308 69.265 0.000 50082
WoS citations LiThelwall2012F1000_16 0.295 0.246 0.342 11.352 0.000 1397
WoS citations Priem2012AltmetricsWild7 0.100 0.074 0.126 7.504 0.000 5596
WoS citations Priem2012AltmetricsWild8 0.200 0.110 0.286 4.329 0.000 459
WoS citations Priem2012AltmetricsWild9 0.300 0.160 0.429 4.083 0.000 177
WoS citations BornmannLeydesdorff2013Validation 0.430 0.275 0.563 5.080 0.000 125
WoS citations 0.264 0.160 0.362 4.870 0.000

Fig. 14 Results of meta-analysis for F1000 compared to other altmetrics (columns: compared metric, study name, correlation, lower and upper 95% CI limits, Z-value, p-value, total N; rows without a study name are pooled estimates per metric):
CiteULike LiThelwall2012F1000_12 0.127 0.075 0.178 4.767 0.000 1397
CiteULike 0.127 0.075 0.178 4.767 0.000
Mendeley LiThelwall2012F1000_5 0.309 0.261 0.356 11.927 0.000 1397
Mendeley 0.309 0.261 0.356 11.927 0.000
Facebook

Figure 15 shows the results of the meta-analysis for Facebook across all studies. The correlation with WoS citations (Haustein et al. 2015b; Priem et al. 2012b) was 0.122, in answer to RQ2.1. However, the overall correlation with WoS citations on non-zero datasets, answering RQ2.2, was 0.109 (Thelwall et al. 2013; Haustein et al. 2015b). The correlation with other altmetrics was 0.202: 0.18 with Blogs, 0.144 with Google+, 0.161 with News, and 0.32 with Twitter (Haustein et al. 2015b), thus answering RQ2.3. The overall correlation across citations and altmetrics resulted in 0.182.
Fig. 15 Results of meta-analysis for Facebook compared to WoS citations and other altmetrics (columns: compared metric, study name, correlation, lower and upper 95% CI limits, Z-value, p-value, total N; rows without a study name are pooled estimates per metric):
Blogs Haustein2015Characterizing3 0.180 0.178 0.182 210.603 0.000 1339279
Blogs 0.180 0.178 0.182 210.603 0.000 1339279
Google+ Haustein2015Characterizing11 0.144 0.142 0.146 167.813 0.000 1339279
Google+ 0.144 0.142 0.146 167.813 0.000 1339279
News Haustein2015Characterizing12 0.161 0.159 0.163 187.956 0.000 1339279
News 0.161 0.159 0.163 187.956 0.000 1339279
Twitter Haustein2015Characterizing7 0.320 0.318 0.322 383.806 0.000 1339279
Twitter 0.320 0.318 0.322 383.806 0.000 1339279
WoS citations Haustein2015Characterizing10 0.097 0.095 0.099 112.609 0.000 1339279
WoS citations Priem2012Altmetrics13 0.100 0.074 0.126 7.504 0.000 5596
WoS citations Priem2012Altmetrics14 0.200 0.110 0.286 4.329 0.000 459
WoS citations Priem2012Altmetrics15 0.300 0.160 0.429 4.083 0.000 177
WoS citations 0.122 0.085 0.159 6.451 0.000 1345511
Google+

Figure 16 shows the results of the meta-analysis for Google+. In response to RQ2.2, the correlation with WoS citations for non-zero datasets was 0.123 (Thelwall et al. 2013; Haustein et al. 2015b). Only one study investigated the correlation between Google+ and other altmetrics and citations (Haustein et al. 2015b). From that study, the correlation with WoS citations was 0.065, thus answering RQ2.1. The overall correlation with other altmetrics (answering RQ2.3) was 0.165: 0.196 with Blogs, 0.144 with Facebook, 0.179 with News, and 0.142 with Twitter (Haustein et al. 2015b). The overall correlation across citations and altmetrics was 0.145.

Fig. 16 Results of meta-analysis for Google+ compared to WoS citations for non-zero data samples
Wikipedia
Figure 17 shows the results of the meta-analysis for Wikipedia. In answer to RQ2.1, the
correlation with WoS citations (Zahedi et al. 2014a; Priem et al. 2012b) was 0.096.
Overall with other altmetrics it was 0.053, thus answering RQ2.3: 0.021 with Delicious,
0.083 with Mendeley, and 0.056 with Twitter (Zahedi et al. 2014a). The overall correlation
across citations and altmetrics was 0.076.
Fig. 17 Results of meta-analysis for Wikipedia compared to WoS citations and other altmetrics (columns: compared metric, study name, correlation, lower and upper 95% CI limits, Z-value, p-value, total N; rows without a study name are pooled estimates per metric):
Delicious Zahedi2014HowWell8 0.021 0.007 0.035 2.953 0.003 19772
Delicious 0.021 0.007 0.035 2.953 0.003
Mendeley Zahedi2014HowWell3 0.083 0.069 0.097 11.697 0.000 19772
Mendeley 0.083 0.069 0.097 11.697 0.000
Twitter Zahedi2014HowWell9 0.056 0.042 0.070 7.882 0.000 19772
Twitter 0.056 0.042 0.070 7.882 0.000
WoS citations Zahedi2014HowWell6 0.094 0.080 0.108 13.256 0.000 19772
WoS citations Priem2012Altmetrics10 0.100 0.074 0.126 7.504 0.000 5596
WoS citations Priem2012Altmetrics11 0.100 0.009 0.190 2.143 0.032 459
WoS citations Priem2012Altmetrics12 0.200 0.054 0.338 2.674 0.007 177
WoS citations 0.096 0.084 0.108 15.544 0.000
Delicious
Figure 18 shows the results of the meta-analysis for Delicious. The correlation with WoS
citations (Zahedi et al. 2014a; Priem et al. 2012b) was 0.07, thus answering RQ2.1.
Overall with other altmetrics it was 0.059 (answering RQ2.3): 0.031 with Mendeley, 0.125
with Twitter, and 0.021 with Wikipedia (Zahedi et al. 2014a). The overall correlation
across citations and altmetrics was 0.064.

Fig. 18 Results of meta-analysis for Delicious compared to WoS citations and other altmetrics (columns: compared metric, study name, correlation, lower and upper 95% CI limits, Z-value, p-value, total N; rows without a study name are pooled estimates per metric):
Mendeley Zahedi2014HowWell4 0.031 0.017 0.045 4.360 0.000 19772
Mendeley 0.031 0.017 0.045 4.360 0.000
Twitter Zahedi2014HowWell12 0.125 0.111 0.139 17.668 0.000 19772
Twitter 0.125 0.111 0.139 17.668 0.000
Wikipedia Zahedi2014HowWell8 0.021 0.007 0.035 2.953 0.003 19772
Wikipedia 0.021 0.007 0.035 2.953 0.003
WoS citations Zahedi2014HowWell10 0.011 -0.003 0.025 1.547 0.122 19772
WoS citations Priem2012Altmetrics16 0.100 0.074 0.126 7.504 0.000 5596
WoS citations Priem2012Altmetrics17 0.100 0.009 0.190 2.143 0.032 459
WoS citations Priem2012Altmetrics18 0.100 -0.048 0.244 1.324 0.186 177
WoS citations 0.070 0.002 0.138 2.020 0.043
References
Adie, E., & Roe, W. (2013). Altmetric: Enriching scholarly content with article-level discussion and metrics.
Learned Publishing, 26(1), 11–17.
Allen, H. G., Stanton, T. R., Di Pietro, F., & Moseley, G. L. (2013). Social media release increases
dissemination of original articles in the clinical pain sciences. PLoS One, 8(7), e68914.
Alperin, J. P. (2015a). Geographic variation in social media metrics: An analysis of Latin American journal
articles. Aslib Journal of Information Management, 67(3), 289–304.
Alperin, J. P. (2015b). Moving beyond counts: A method for surveying Twitter users. http://altmetrics.org/
altmetrics15/alperin/. Accessed 18 Feb 2016.
Andersen, J. P., & Haustein, S. (2015). Influence of study type on Twitter activity for medical research
papers. In Proceedings of the 15th international society of scientometrics and informetrics conference.
Araújo, R. F., Murakami, T. R., De Lara, J. L., & Fausto, S. (2015). Does the global south have altmetrics?
Analyzing a Brazilian LIS journal. In Proceedings of the 15th international society of scientometrics
and informetrics conference, (pp. 111–112).
Bar-Ilan, J. (2012). JASIST@mendeley. In ACM web science conference 2012 workshop.
Bar-Ilan, J. (2014). Astrophysics publications on arXiv, Scopus and Mendeley: a case study. Scientometrics,
100(1), 217–225.
Bar-Ilan, J., Haustein, S., Peters, I., Priem, J., Shema, H., & Terliesner, J. (2012). Beyond citations:
Scholars’ visibility on the social Web. arXiv preprint. arXiv:1205.5611.
Bornmann, L. (2014a). Alternative metrics in scientometrics: A meta-analysis of research into three alt-
metrics. Scientometrics, 103(3), 1123–1144.
Bornmann, L. (2014b). Do altmetrics point to the broader impact of research? An overview of benefits and
disadvantages of altmetrics. Journal of Informetrics, 8(4), 895–903.
Bornmann, L. (2014c). Validity of altmetrics data for measuring societal impact: A study using data from
Altmetric and F1000Prime. Journal of Informetrics, 8(4), 935–950.
Bornmann, L. (2015a). Interrater reliability and convergent validity of F1000Prime peer review. Journal of
the Association for Information Science and Technology, 66(12), 2415–2426.
Bornmann, L. (2015b). Letter to the editor: On the conceptualisation and theorisation of the impact caused
by publications. Scientometrics, 103(3), 1145–1148.
Bornmann, L. (2015c). Usefulness of altmetrics for measuring the broader impact of research. Aslib Journal
of Information Management, 67(3), 305–319.
Bornmann, L., & Haunschild, R. (2015). Which people use which scientific papers? An evaluation of data
from F1000 and Mendeley. Journal of Informetrics, 9(3), 477–487.
Bornmann, L., & Leydesdorff, L. (2013). The validation of (advanced) bibliometric indicators through peer
assessments: A comparative study using data from InCites and F1000. Journal of Informetrics, 7(2), 286–291.
Bornmann, L., & Leydesdorff, L. (2015). Does quality and content matter for citedness? A comparison with
para-textual factors and over time. Journal of Informetrics, 9(3), 419–429.
Bornmann, L., & Marx, W. (2015). Methods for the generation of normalized citation impact scores in
bibliometrics: Which method best reflects the judgements of experts? Journal of Informetrics, 9(2),
408–418.
Bowman, T. D. (2015). Tweet or publish: A comparison of 395 professors on Twitter. In Proceedings of the
15th international society of scientometrics and informetrics conference.
Buschman, M., & Michalek, A. (2013). Are alternative metrics still alternative? Bulletin of the American
Society for Information Science and Technology, 39(4), 35–39.
Cabezas-Clavijo, Á., Robinson-García, N., Torres-Salinas, D., Jiménez-Contreras, E., Mikulka, T.,
Gumpenberger, C., Wernisch, A., & Gorraiz, J. (2013). Most borrowed is most cited? Library loan
statistics as a proxy for monograph selection in citation indexes. In Proceedings of the 14th interna-
tional society of scientometrics and informetrics conference (Vol. 2, pp. 1237–1252).
Chamberlain, S. (2013). Consuming article-level metrics: Observations and lessons. Information Standards
Quarterly, 25(2), 4–13.
Chen, K., Tang, M., Wang, C., & Hsiang, J. (2015). Exploring alternative metrics of scholarly performance
in the social sciences and humanities in Taiwan. Scientometrics, 102(1), 97–112.
Colledge, L. (2014). Snowball metrics recipe book, 2nd ed. Amsterdam, the Netherlands: Snowball Metrics
program partners.
Costas, R., & van Leeuwen, T. N. (2012). Approaching the ‘‘reward triangle’’: General analysis of the
presence of funding acknowledgments and ‘‘peer interactive communication’’ in scientific publica-
tions. Journal of the American Society for Information Science and Technology, 63(8), 1647–1661.
Costas, R., Zahedi, Z., & Wouters, P. (2015). Do ‘‘altmetrics’’ correlate with citations? Extensive com-
parison of altmetric indicators with citations from a multidisciplinary perspective. Journal of the
Association for Information Science and Technology, 66(10), 2003–2019.
Cronin, B. (2013). The evolving indicator space (iSpace). Journal of the American Society for Information
Science and Technology, 64(8), 1523–1525.
Davis, B., Hulpuş, I., Taylor, M., & Hayes, C. (2015). Challenges and opportunities for detecting and
measuring diffusion of scientific impact across heterogeneous altmetric sources. http://altmetrics.org/
altmetrics15/davis/. Accessed 18 Feb 2016.
De Winter, J. (2015). The relationship between tweets, citations, and article views for PLOS One articles.
Scientometrics, 102(2), 1773–1779.
Edelman, B., Larkin, I., et al. (2009). Demographics, career concerns or social comparison: Who games SSRN download counts? Harvard Business School.
Eyre-Walker, A., & Stoletzki, N. (2013). The assessment of science: The relative merits of post-publication
review, the impact factor, and the number of citations. PLoS Biol, 11(10), e1001675.
Eysenbach, G. (2012). Can tweets predict citations? Metrics of social impact based on Twitter and corre-
lation with traditional metrics of scientific impact. Journal of Medical Internet Research, 13(4), e123.
Fairclough, R., & Thelwall, M. (2015). National research impact indicators from Mendeley readers. Journal
of Informetrics, 9(4), 845–859.
Fausto, S., Machado, F. A., Bento, L. F. J., Iamarino, A., Nahas, T. R., & Munger, D. S. (2012). Research
blogging: Indexing and registering the change in science 2.0. PLoS One, 7(12), e50109.
Fenner, M. (2013). What can article-level metrics do for you? PLoS Biol, 11(10), e1001687.
García, N. R., Salinas, D. T., Zahedi, Z., & Costas, R. (2014). New data, new possibilities: exploring the
insides of Altmetric.com. El profesional de la información, 23(4), 359–366.
Glänzel, W., & Gorraiz, J. (2015). Usage metrics versus altmetrics: Confusing terminology? Scientometrics,
3(102), 2161–2164.
Gordon, G., Lin, J., Cave, R., & Dandrea, R. (2015). The question of data integrity in article-level metrics.
PLoS Biol, 13(8), e1002161.
Haak, L. L., Fenner, M., Paglione, L., Pentz, E., & Ratner, H. (2012). ORCID: A system to uniquely identify
researchers. Learned Publishing, 25(4), 259–264.
Hammarfelt, B. (2013). An examination of the possibilities that altmetric methods offer in the case of the
humanities. In Proceedings of the 14th international society of scientometrics and informetrics con-
ference, (Vol. 1, pp. 720–727).
Hammarfelt, B. (2014). Using altmetrics for assessing research impact in the humanities. Scientometrics,
101(2), 1419–1430.
Haunschild, R., Stefaner, M., & Bornmann, L. (2015). Who publishes, reads, and cites papers? An analysis
of country information. In Proceedings of the 15th international society of scientometrics and infor-
metrics conference.
Haustein, S., & Larivière, V. (2014). A multidimensional analysis of Aslib proceedings-using everything but
the impact factor. Aslib Journal of Information Management, 66(4), 358–380.
Haustein, S., & Larivière, V. (2015). The use of bibliometrics for assessing research: possibilities, limita-
tions and adverse effects. In Incentives and performance, Springer, pp. 121–139.
Haustein, S., Bowman, T. D., Holmberg, K., Tsou, A., Sugimoto, C. R., & Larivière, V. (2015a). Tweets as
impact indicators: Examining the implications of automated ‘‘bot’’ accounts on Twitter. Journal of the
Association for Information Science and Technology, 67(1), 232–238.
Haustein, S., Costas, R., & Larivière, V. (2015b). Characterizing social media metrics of scholarly papers:
The effect of document properties and collaboration patterns. PLoS One, 10(3), e0120495.
Haustein, S., Bowman, T. D., & Costas, R. (2016). Interpreting ‘Altmetrics’: Viewing acts on social media
through the lens of citation and social theories. In Cassidy R. Sugimoto (Ed.), Theories of informetrics
and scholarly communication. A Festschrift in honor of Blaise Cronin (pp. 372–406). De Gruyter.
Haustein, S., Peters, I., Bar-Ilan, J., Priem, J., Shema, H., & Terliesner, J. (2013). Coverage and adoption of
altmetrics sources in the bibliometric community. In Proceedings of the 14th international society of
scientometrics and informetrics conference (Vol. 1, pp. 468–483).
Haustein, S., Peters, I., Bar-Ilan, J., Priem, J., Shema, H., & Terliesner, J. (2014a). Coverage and adoption of
altmetrics sources in the bibliometric community. Scientometrics, 101(2), 1145–1163.
Haustein, S., Peters, I., Sugimoto, C. R., Thelwall, M., & Larivière, V. (2014b). Tweeting biomedicine: An
analysis of tweets and citations in the biomedical literature. Journal of the Association for Information
Science and Technology, 65(4), 656–669.
Haustein, S., & Siebenlist, T. (2011). Applying social bookmarking data to evaluate journal usage. Journal
of Informetrics, 5(3), 446–457.
Henning, V. (2010). The top 10 journal articles published in 2009 by readership on Mendeley. Mendeley
Blog. http://www.mendeley.com/blog/academic-features/the-top-10-journalarticles-published-in-2009-
by-readership-on-mendeley. Accessed 18 Feb 2016.
Hoffmann, C. P., Lutz, C., & Meckel, M. (2015). A relational altmetric? Network centrality on
ResearchGate as an indicator of scientific impact. Journal of the Association for Information Science
and Technology, 67(4), 765–775.
Holmberg, K. (2015). Online Attention of Universities in Finland: Are the bigger universities bigger online
too? In Proceedings of the 15th international society of scientometrics and informetrics conference.
Holmberg, K., & Thelwall, M. (2014). Disciplinary differences in Twitter scholarly communication.
Scientometrics, 101(2), 1027–1042.
Hopkins, W. G. (2004). An introduction to meta-analysis. Sportscience, 8, 20–24.
Howison, J., & Bullard, J. (2015). Software in the scientific literature: Problems with seeing, finding, and
using software mentioned in the biology literature. Journal of the Association for Information Science
and Technology (in press).
Jiang, J., He, D., & Ni, C. (2013). The correlations between article citation and references' impact measures: What can we learn? In Proceedings of the American society for information science and technology (Vol. 50, pp. 1–4). Wiley Subscription Services, Inc., A Wiley Company.
Knoth, P., & Herrmannova, D. (2014). Towards semantometrics: A new semantic similarity based measure
for assessing a research publication’s contribution. D-Lib Magazine, 20(11), 8.
Kousha, K., & Thelwall, M. (2015a). Alternative metrics for book impact assessment: Can choice reviews
be a useful source? In Proceedings of the 15th international society of scientometrics and informetrics
conference.
Kousha, K., & Thelwall, M. (2015b). An automatic method for assessing the teaching impact of books from
online academic syllabi. Journal of the Association for Information Science and Technology (in press).
Kousha, K., & Thelwall, M. (2015c). Can Amazon.com reviews help to assess the wider impacts of books?
Journal of the Association for Information Science and Technology, 67(3), 566–581.
Kraker, P., Schlögl, C., Jack, K., & Lindstaedt, S. (2015). Visualization of co-readership patterns from an
online reference management system. Journal of Informetrics, 9(1), 169–182.
Kumar, S., & Mishra, A. K. (2015). Bibliometrics to altmetrics and its impact on social media. International
Journal of Scientific and Innovative Research Studies, 3(3), 56–65.
Kurtz, M. J., & Henneken, E. A. (2014). Finding and recommending scholarly articles. Beyond biblio-
metrics: harnessing multidimensional indicators of scholarly impact, pp. 243–259.
Li, X., Thelwall, M., & Giustini, D. (2011). Validating online reference managers for scholarly impact
measurement. Scientometrics, 91(2), 461–471.
Li, X., & Thelwall, M. (2012). F1000, Mendeley and traditional bibliometric indicators. In
Proceedings of the 17th international conference on science and technology indicators, pp. 451–551.
Lin, J. (2012). A case study in anti-gaming mechanisms for altmetrics: PLoS ALMs and datatrust. http://
altmetrics.org/altmetrics12/lin. Accessed 18 Feb 2016.
Lin, J., & Fenner, M. (2013a). Altmetrics in evolution: defining and redefining the ontology of article-level
metrics. Information Standards Quarterly, 25(2), 20.
Lin, J., & Fenner, M. (2013b). The many faces of article-level metrics. Bulletin of the American Society for
Information Science and Technology, 39(4), 27–30.
Liu, C. L., Xu, Y. Q., Wu, H., Chen, S. S., & Guo, J. J. (2013). Correlation and interaction visualization of
altmetric indicators extracted from scholarly social network activities: dimensions and structure.
Journal of Medical Internet Research, 15(11), e259.
Liu, J., & Adie, E. (2013). Five challenges in altmetrics: A toolmaker’s perspective. Bulletin of the American
Society for Information Science and Technology, 39(4), 31–34.
Loach, T. V., & Evans, T. S. (2015). Ranking journals using altmetrics. In Proceedings of the 15th inter-
national society of scientometrics and informetrics conference.
Maflahi, N., & Thelwall, M. (2015). When are readership counts as useful as citation counts? Scopus versus Mendeley
for LIS journals. Journal of the Association for Information Science and Technology, 67(1), 191–199.
Maleki, A. (2015a). Mendeley readership impact of academic articles of Iran. In Proceedings of the 15th
international society of scientometrics and informetrics conference.
Maleki, A. (2015b). PubMed and ArXiv vs. Gold open access: Citation, Mendeley, and Twitter uptake of
academic articles of Iran. In Proceedings of the 15th international society of scientometrics and
informetrics conference.
Mas-Bleda, A., Thelwall, M., Kousha, K., & Aguillo, I. F. (2014). Do highly cited researchers successfully
use the social web? Scientometrics, 101(1), 337–356.
Mayr, P., & Scharnhorst, A. (2015). Scientometrics and information retrieval: weak-links revitalized.
Scientometrics, 102(3), 2193–2199.
Mohammadi, E., & Thelwall, M. (2013). Assessing non-standard article impact using F1000 labels.
Scientometrics, 97(2), 383–395.
Mohammadi, E., & Thelwall, M. (2014). Mendeley readership altmetrics for the social sciences and
humanities: Research evaluation and knowledge flows. Journal of the Association for Information
Science and Technology, 65(8), 1627–1638.
Mohammadi, E., Thelwall, M., Haustein, S., & Larivière, V. (2015a). Who reads research articles? An
altmetrics analysis of Mendeley user categories. Journal of the Association for Information Science
and Technology, 66(9), 1832–1846.
Mohammadi, E., Thelwall, M., & Kousha, K. (2015b). Can Mendeley bookmarks reflect readership? A
survey of user motivations. Journal of the Association for Information Science and Technology, 67(5),
1198–1209.
NISO (2014). NISO alternative metrics (altmetrics) initiative phase 1 white paper. http://www.niso.org/
apps/group_public/download.php/13809/Altmetrics_project_phase1_white_paper.pdf. Accessed 18
Feb 2016.
Orduña-Malea, E., Ortega, J. L., & Aguillo, I. F. (2014). Influence of language and file type on the web
visibility of top European universities. Aslib Journal of Information Management, 66(1), 96–116.
Ortega, J. L. (2015a). How is an academic social site populated? A demographic study of Google Scholar
citations population. Scientometrics, 104(1), 1–18.
Ortega, J. L. (2015b). Relationship between altmetric and bibliometric indicators across academic social
sites: The case of CSIC’s members. Journal of Informetrics, 9(1), 39–49.
Paul-Hus, A., Sugimoto, C. R., Haustein, S., & Larivière, V. (2015). Is there a gender gap in social media
metrics? In Proceedings of the 15th international society of scientometrics and informetrics confer-
ence, pp. 37–45.
Peters, I., Beutelspacher, L., Maghferat, P., & Terliesner, J. (2012). Scientific bloggers under the altmetric
microscope. In Proceedings of the American society for information science and technology (Vol. 49,
pp. 1–4) Wiley Subscription Services, Inc., A Wiley Company.
Peters, I., Jobmann, A., Hoffmann, C. P., Künne, S., Schmitz, J., & Wollnik-Korn, G. (2014). Altmetrics for
large, multidisciplinary research groups: Comparison of current tools. Bibliometrie-Praxis und For-
schung, 3(1), 1–19.
Peters, I., Kraker, P., Lex, E., Gumpenberger, C., & Gorraiz, J. (2015). Research data explored: Citations
versus altmetrics. In Proceedings of the 15th international society of scientometrics and informetrics
conference.
Piwowar, H. (2013). Altmetrics: Value all research products. Nature, 493(7431), 159–159.
Piwowar, H., & Priem, J. (2013). The power of altmetrics on a CV. Bulletin of the American Society for
Information Science and Technology, 39(4), 10–13.
Priem, J. (2014). Altmetrics. Beyond Bibliometrics: harnessing multidimensional indicators of scholarly
impact, pp. 263–287.
Priem, J., & Hemminger, B. M. (2010). Scientometrics 2.0: Toward new metrics of scholarly impact on the
social Web. First Monday, 15(7).
Priem, J., Parra, C., Piwowar, H., & Waagmeester, A. (2012a). Uncovering impacts: CitedIn and total-
impact, two new tools for gathering altmetrics. Paper presented at the iConference 2012.
Priem, J., Piwowar, H. A., & Hemminger, B. M. (2012b). Altmetrics in the wild: Using social media to
explore scholarly impact. arXiv preprint. arXiv:1203.4745
Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. http://altmetrics.org/
manifesto. Accessed 18 Feb 2016.
Ringelhan, S., Wollersheim, J., & Welpe, I. M. (2015). I like, I cite? Do Facebook likes predict the impact of
scientific work? PLoS One, 10(8), e0134389.
Schlögl, C., Gorraiz, J., Gumpenberger, C., Jack, K., & Kraker, P. (2014). Comparison of downloads,
citations and readership data for two information systems journals. Scientometrics, 101(2), 1113–1128.
Shema, H., Bar-Ilan, J., & Thelwall, M. (2014). Do blog citations correlate with a higher number of future
citations? Research blogs as a potential source for alternative metrics. Journal of the Association for
Information Science and Technology, 65(5), 1018–1027.
Shema, H., Bar-Ilan, J., & Thelwall, M. (2015). How is research blogged? A content analysis approach.
Journal of the Association for Information Science and Technology, 66(6), 1136–1149.
Shuai, X., Pepe, A., & Bollen, J. (2012). How the scientific community reacts to newly submitted preprints:
Article downloads, Twitter mentions, and citations. PLoS One, 7(11), e47523.
Sotudeh, H., Mazarei, Z., & Mirzabeigi, M. (2015). CiteULike bookmarks are correlated to citations at
journal and author levels in library and information science. Scientometrics, 105(3), 2237–2248.
Sud, P., & Thelwall, M. (2015). Not all international collaboration is beneficial: The Mendeley readership
and citation impact of biochemical research collaboration. Journal of the Association for Information
Science and Technology, 67(8), 1849–1857.
Tang, J., Zhang, J., Yao, L., Li, J., Zhang, L., & Su, Z. (2008). Arnetminer: extraction and mining of
academic social networks. In Proceedings of the 14th ACM SIGKDD international conference on
Knowledge discovery and data mining, ACM, pp. 990–998.
Tang, M.-c., Wang, C.-m., Chen, K.-h., & Hsiang, J. (2012). Exploring alternative cyberbibliometrics for
evaluation of scholarly performance in the social sciences and humanities in Taiwan. In Proceedings of
the American society for information science and technology, (Vol. 49, pp. 1–1). Wiley Subscription
Services, Inc., A Wiley Company.
Taylor, M. (2013). Exploring the boundaries: How altmetrics can expand our vision of scholarly commu-
nication and social impact. Information Standards Quarterly, 25(2), 27–32.
Thelwall, M. (2010). Introduction to LexiURL searcher: A research tool for social scientists. Statistical
cybermetrics research group, University of Wolverhampton. http://lexiurl.wlv.ac.uk. Accessed 18 Feb
2016
Thelwall, M. (2012a). Introduction to webometric analyst 2.0: A research tool for social scientists. Sta-
tistical cybermetrics research group, University of Wolverhampton. http://webometrics.wlv.ac.uk.
Accessed 18 Feb 2016.
Thelwall, M. (2012b). Journal impact evaluation: A webometric perspective. Scientometrics, 92(2),
429–441.
Thelwall, M., & Fairclough, R. (2015a). Geometric journal impact factors correcting for individual highly
cited articles. Journal of Informetrics, 9(2), 263–272.
Thelwall, M., & Fairclough, R. (2015b). The influence of time and discipline on the magnitude of corre-
lations between citation counts and quality scores. Journal of Informetrics, 9(3), 529–541.
Thelwall, M., Haustein, S., Larivière, V., & Sugimoto, C. R. (2013). Do altmetrics work? Twitter and ten
other social web services. PLoS One, 8(5), e64841.
Thelwall, M., & Kousha, K. (2014). Academia.edu: Social network or academic network? Journal of the
Association for Information Science and Technology, 65(4), 721–731.
Thelwall, M., & Kousha, K. (2015). ResearchGate: Disseminating, communicating, and measuring schol-
arship? Journal of the Association for Information Science and Technology, 66(5), 876–889.
Thelwall, M., & Maflahi, N. (2015a). Are scholarly articles disproportionately read in their own country? An
analysis of Mendeley readers. Journal of the Association for Information Science and Technology,
66(6), 1124–1135.
Thelwall, M., & Maflahi, N. (2015b). Guideline references and academic citations as evidence of the clinical
value of health research. Journal of the Association for Information Science and Technology, 67(4),
960–966.
Thelwall, M., & Sud, P. (2015). Mendeley readership counts: An investigation of temporal and disciplinary
differences. Journal of the Association for Information Science and Technology (in press).
Thelwall, M., & Wilson, P. (2015a). Does research with statistics have more impact? The citation rank
advantage of structural equation modeling. Journal of the Association for Information Science and
Technology, 67(5), 1233–1244.
Thelwall, M., & Wilson, P. (2015b). Mendeley readership altmetrics for medical articles: An analysis of 45
fields. Journal of the Association for Information Science and Technology (in press).
Torres-Salinas, D., & Milanés-Guisado, Y. (2014). Presencia en redes sociales y altmétricas de los prin-
cipales autores de la revista ‘‘El Profesional de la Información’’. El profesional de la información,
23(3).
Uren, V., & Dadzie, A.-S. (2015). Public science communication on Twitter: A visual analytic approach.
Aslib Journal of Information Management, 67(3), 337–355.
Waltman, L., & Costas, R. (2014). F1000 recommendations as a potential new data source for research
evaluation: A comparison with citations. Journal of the Association for Information Science and
Technology, 65(3), 433–445.
Weller, K. (2015). Social media and altmetrics: An overview of current alternative approaches to measuring
scholarly impact. In Incentives and Performance, (pp. 261–276). Springer.
Weller, K., & Peters, I. (2012). Citations in Web 2.0. Science and the Internet, (pp. 209–222).
Wouters, P., & Costas, R. (2012). Users, narcissism and control: tracking the impact of scholarly publi-
cations in the 21st century. SURFfoundation Utrecht.
Yan, K.-K., & Gerstein, M. (2011). The spread of scientific information: Insights from the web usage
statistics in PLoS article-level metrics. PLoS One, 6(5), 1–7.
Zahedi, Z., Costas, R., & Wouters, P. (2014a). How well developed are altmetrics? A cross-disciplinary
analysis of the presence of ’alternative metrics’ in scientific publications. Scientometrics, 101(2),
1491–1513.
Zahedi, Z., Costas, R., & Wouters, P. (2015a). Do Mendeley readership counts help to filter highly cited
WoS publications better than average citation impact of journals (JCS)? In Proceedings of the 15th
international society of scientometrics and informetrics conference.
Zahedi, Z., Fenner, M., & Costas, R. (2014b). How consistent are altmetrics providers? Study of 1000 PLoS
One publications using the PLoS ALM, Mendeley and Altmetric.com APIs. In altmetrics 14. Workshop
at the web science conference, Bloomington, USA.
Zahedi, Z., Fenner, M., & Costas, R. (2015b). Consistency among altmetrics data provider/aggregators: what
are the challenges? In altmetrics15: 5 years in, what do we know? The 2015 altmetrics workshop,
Amsterdam.
Zhou, Q., & Zhang, C. (2015). Can book reviews be used to evaluate books’ influence? In Proceedings of
the 15th international society of scientometrics and informetrics conference.
Zuccala, A. A., Verleysen, F. T., Cornacchia, R., & Engels, T. C. (2015). Altmetrics for the humanities:
Comparing Goodreads reader ratings with citations to history books. Aslib Journal of Information
Management, 67(3), 320–336.