Are so-called ‘alternative’ metrics documenting attention to outputs from publishers, access types, funders, institutions and countries that are usually invisible via traditional citation metrics? To put it another way: can altmetrics contribute to higher visibility for outputs usually excluded from mainstream metrication?
In this blog post I share some data from a text analysis I conducted on the metadata included in the complete Altmetric 2018 raw dataset.
The Altmetric Top 100 is an annual list of the research outputs that received the most online attention each year on the platforms and services Altmetric monitors. Altmetric has released a Top 100 list every year since 2013.
Over time Altmetric has enriched the metadata they share, and now also makes the raw data openly available on Figshare. This is very welcome: in the past we had to request the data directly and/or do our own analysis of the data to detect, for example, outputs’ access type, subjects, funders, or institutional and country affiliations. This essential information is now provided in the dataset they share.
You can verify some of these counts by comparing them with those offered by Altmetric through their Top 100 2018 interface. The raw dataset includes 212 outputs. Please note that metadata count totals do not always sum to 212: beyond the top 100, the presence of metadata in the raw dataset is variable.
The usual limitations apply: the raw data may need refining and deduplication, and counts may have been affected by undisambiguated metadata (e.g. randomised vs randomized) in the original dataset. All counts require further discussion, which, should I find time, I may add in the future.
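One way to reduce the effect of spelling variants such as randomised/randomized before counting is to map them to a single form. A minimal sketch, assuming tokenised title text; the variant map below is illustrative and not taken from the dataset itself:

```python
# Sketch: normalising British/American spelling variants before counting.
# The VARIANTS map is illustrative, not derived from the Altmetric dataset.
VARIANTS = {
    "randomised": "randomized",
    "behaviour": "behavior",
    "colour": "color",
}

def normalise(token: str) -> str:
    """Lower-case a token and map known spelling variants to one form."""
    token = token.lower()
    return VARIANTS.get(token, token)

print(normalise("Randomised"))  # prints "randomized"
```

Running tokens through a step like this before tallying keeps variant spellings from splitting a single keyword's count in two.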
‘Cirrus’ Cloud of Top 100 Keywords in 212 Output Titles
Top 100 Keywords in 212 Output Titles
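A keyword cloud like the one above is, underneath, a word-frequency count over the output titles with common stopwords removed. A minimal sketch of that counting step, using illustrative titles and a toy stopword list rather than the actual dataset:

```python
import re
from collections import Counter

# Sketch: counting keywords across output titles, roughly what a word
# cloud visualises. Titles and the stopword list here are illustrative.
STOPWORDS = {"the", "of", "in", "a", "an", "and", "on", "for", "to"}

def top_keywords(titles, n=10):
    """Tokenise titles, drop stopwords, return the n most common words."""
    counts = Counter()
    for title in titles:
        for word in re.findall(r"[a-z]+", title.lower()):
            if word not in STOPWORDS:
                counts[word] += 1
    return counts.most_common(n)

titles = [
    "Global warming of the oceans",
    "Ocean warming and health",
]
print(top_keywords(titles, 3))  # 'warming' appears twice, the rest once
```

The same counts can then be fed into a visualisation tool; the cloud itself adds no information beyond these frequencies.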
Publishers by Output Count in Raw Dataset (212 Outputs)
| Publisher | Output Count |
| --- | --- |
| American Association for the Advancement of Science | 31 |
| American Public Health Association | 16 |
| Massachusetts Medical Society | 15 |
| United States National Academy of Sciences | 15 |
| American Heart Association | 2 |
| Public Library of Science | 2 |
| Alliance for Academic Internal Medicine | 1 |
| American Economic Association | 1 |
| Canadian Science Publishing | 1 |
| Cold Spring Harbor Laboratory Press | 1 |
| Oxford University Press | 1 |
| Taylor & Francis Group | 1 |
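Per-field tallies like the publisher table above amount to grouping the dataset rows by one metadata column and counting. A minimal sketch against an inline sample, assuming a column named `Publisher` in the raw CSV (the actual column names in the Figshare file may differ):

```python
import csv
import io
from collections import Counter

# Sketch: tallying outputs per publisher from a CSV export.
# The column name 'Publisher' and the rows are assumptions for illustration.
sample = io.StringIO(
    "Publisher\n"
    "Springer Nature\n"
    "Elsevier\n"
    "Springer Nature\n"
)
counts = Counter(row["Publisher"] for row in csv.DictReader(sample))
for publisher, n in counts.most_common():
    print(publisher, n)  # prints publishers in descending count order
```

Swapping `Publisher` for a subjects or access-type column yields the other tables in this post in the same way.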
Subjects by Output Count in Raw Dataset (212 Outputs)
| Subject | Output Count |
| --- | --- |
| Medical & Health Sciences | 44 |
| Earth & Environmental Sciences | 17 |
| Studies in Human Society | 11 |
| History & Archaeology | 7 |
| Research & Reproducibility | 4 |
| Information & Computer Sciences | 2 |
Countries of First Author Affiliation (including both single-country and multi-country output bylines) by Output Count in Raw Dataset (where metadata was available)
| Country of First Author; All | Output Count |
| --- | --- |
| No country data in affil; Spain; United States | 1 |
There Be Dragons
Outputs with a single-country (non-international) author affiliation in Raw Dataset (where metadata was available)
| Single country author affiliation | Output Count |
| --- | --- |
Access Types in Raw Dataset (where metadata was available)
| Access type | Output Count |
| --- | --- |
| Free to read | 13 |
Access Types in Top 100 Outputs
| Access Type | Output Count |
| --- | --- |
| Free to read | 13 |
- No outputs in the Arts and Humanities proper are included in the dataset: even those in the History & Archaeology subject category (7 outputs) were published in STEM venues.
- Springer Nature dominates the list, even above Elsevier: is this because of Altmetric’s connection with the Nature Publishing Group? Stacy Konkiel from Altmetric responds: "Definitely not :) Our systems aren't preferential to NPG pubs/journals–they're agnostic. Why does NPG dominate? Hard to say!" (2018, Dec 12)
- The United States continues to dominate author country affiliations in both single-author bylines and international multi-author bylines, followed at a distance by the UK.
- Brazil is the only South American country with a first-author country affiliation in the raw dataset.
- South Africa is the only African country with an author affiliation in the raw dataset.
- Stacy Konkiel from Altmetric is right to clarify that “the ‘countries’ aren’t just first author countries, they are for all authors associated with T100 papers (beyond the first 100, coverage is spottier, as we had less reason to enrich and check it manually)” and that “also worth looking into is the preponderance of papers in the T100 by a few of the same teams. I was genuinely surprised to see how similar they were, and given that, that they had enough attention to all make it into the T100.” (2018, Dec 12; thread)
- Not Open Access (including ‘Free to Read’, which despite the name is not Open Access) dominates the access types in both the Top 100 and the complete raw dataset.
Altmetric Engineering (2018). 2018 Altmetric Top 100 – dataset [Data set]. figshare. https://doi.org/10.6084/m9.figshare.7441304.v1
Konkiel, Stacy [skonkiel]. (2018, Dec 12). @ernestopriego “Springer Nature dominates the list…is this because of Altmetric’s connection with the Nature… https://t.co/jsA8P4yeez [Tweet]. Retrieved from https://twitter.com/skonkiel/status/1072944698807988225
Konkiel, Stacy [skonkiel]. (2018, Dec 12). @ernestopriego also worth noting is that the ‘countries’ aren’t just first author countries, they are for all autho… https://t.co/K5Z6pYhxjm [Tweet]. Retrieved from https://twitter.com/skonkiel/status/1072945115805704193
Konkiel, Stacy [skonkiel]. (2018, Dec 12). @ernestopriego Also worth looking into is the preponderance of papers in the T100 by a few of the same teams. I was… https://t.co/DvYNDkJRYz [Tweet]. Retrieved from https://twitter.com/skonkiel/status/1072946012388499457