Dave Kochalko of ARTiFACTS.ai takes us through key presentations from the STI 2018 conference that clearly demonstrate the crevasses littering citation metrics: knowledge gaps, research not counted, and biases not accounted for. The blog post points to how scientometrics research and new technologies like blockchain can be the foundations for a long-overdue reworking of science metrics.

I had the opportunity to present at the recent 23rd International Conference on Science and Technology Indicators (STI 2018). The conference theme of “Science, Technology and Innovation indicators in transition” delivered a diverse mix of insights, conclusions, and implications that will be important for science and the scientometrics community going forward.

Organized in collaboration with the European Network of Indicator Designers (ENID) and hosted by the Centre for Science and Technology Studies (CWTS) at Leiden University, the conference featured nearly 160 papers and 35 posters presented to over 300 attendees. Quite a draw and lots to choose from, with some “Wow!” moments as well.

Focusing on the “science of science” as it does, scientometrics is well-positioned to provide analysis and insight into what’s working well in science and what’s not. Highlights from the most valuable talks are:

  • Beware the “study limitations” caveat in bibliometric research: The selective indexing policies of citation databases lock in an inherent and significant bias (as much as 50%) in bibliometric research. Yet analyses based on such narrowly constructed datasets typically draw broad, sweeping conclusions without accounting for the absent underlying data (keynote by Cameron Neylon, Curtin University, Australia).
  • Citations, citations everywhere, nor any fit for my research: To paraphrase Coleridge’s ‘The Rime of the Ancient Mariner’, a surprising result from a survey of 13,000 researchers finds that 60% of the citations contained in published articles were confirmed by the author(s) to have had no influence on their research! This result was consistent across all disciplines and begs the question of how to identify the meaningful citations, something not yet addressed by the authors of this study (presentation by E. Duede, University of Chicago).
  • Altmetric indicators settling in as metrics “of interest”: When they were introduced, altmetrics were hailed by some as the replacement for citations as the currency of science; bibliometric research now suggests they serve merely as useful indications of interest. Nice to have as early facilitators of communication and worthy of further study, perhaps, but not suitable for research evaluation (P. Wouters et al.).
  • Short-term bias in research funding decisions exposed: An analysis of research funding in the U.S., based on measuring the degree of “novelty” in proposals, illustrates how risk-averse evaluation processes skew funding toward short-term goals, to the detriment of basic research (keynote by Paula Stephan, Georgia State University) (Wang, Veugelers, and Stephan 2017).

So, my take-aways:

  • Using citation databases today for assessment and policy decisions is akin to navigating with a map that provides only a patchy and incomplete view of the landscape. We are long overdue in addressing the limited and biased view of scientific contribution and scholarship which current citation databases impose.
  • Citations in their current form, and indexes that count only papers as outputs, are insufficient for understanding the contribution and influence researchers have on their disciplines.
  • Funding policies and the quality of decision-making would be enhanced significantly by sharpening our insight into the actual research being conducted.

Here’s where ARTiFACTS has an important role to play. In an era where we have Web 3.0 technologies such as artificial intelligence and distributed computing at our disposal, ARTiFACTS is positioned to bring improvements for science and scientometrics in the form of: 1) greater visibility of findings and researchers, in near real time, by indexing all types of research outputs as they are created and well before results may ultimately be reported in publications; 2) provenance, with clearer digital trails and stronger, more confident signal intelligence; and 3) a community-curated index that will address the coverage gaps and ambiguity in current offerings.
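To make the provenance idea more concrete, here is a minimal illustrative sketch (in Python) of how a research output might be recorded as a hash-chained ledger entry that cites earlier artifacts, giving the kind of tamper-evident digital trail described above. This is not the ARTiFACTS implementation or API; the class names, fields, and the simple in-memory ledger are hypothetical, used only to show the shape of the idea.

```python
import hashlib
import json
import time
from dataclasses import dataclass, field
from typing import List

@dataclass
class ArtifactRecord:
    """Hypothetical ledger entry for a single research output (dataset, protocol, figure, preprint, ...)."""
    creator: str                    # researcher identifier, e.g. an ORCID iD
    artifact_type: str              # "dataset", "protocol", "figure", "preprint", ...
    content_hash: str               # hash of the artifact's content, fixing what existed and when
    cites: List[str] = field(default_factory=list)  # hashes of earlier records this output builds on
    timestamp: float = field(default_factory=time.time)
    prev_record_hash: str = ""      # hash of the previous ledger entry (the chain)

    def record_hash(self) -> str:
        """Deterministic hash over the record's fields; any later edit breaks the chain."""
        payload = json.dumps(self.__dict__, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()


class ProvenanceLedger:
    """Append-only, hash-chained list of artifact records (a stand-in for a blockchain)."""
    def __init__(self) -> None:
        self.records: List[ArtifactRecord] = []

    def add(self, record: ArtifactRecord) -> str:
        record.prev_record_hash = self.records[-1].record_hash() if self.records else ""
        self.records.append(record)
        return record.record_hash()


# Usage: register a dataset, then a figure that declares the dataset as its source.
ledger = ProvenanceLedger()
data_id = ledger.add(ArtifactRecord(creator="0000-0002-1825-0097",
                                    artifact_type="dataset",
                                    content_hash=hashlib.sha256(b"raw measurements").hexdigest()))
fig_id = ledger.add(ArtifactRecord(creator="0000-0002-1825-0097",
                                   artifact_type="figure",
                                   content_hash=hashlib.sha256(b"figure 1").hexdigest(),
                                   cites=[data_id]))
```

The point of the sketch is simply that when outputs are registered as they are created, the citation links between them form a queryable graph long before a journal article appears, which is exactly the gap in current citation indexes discussed above.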

There’s much more to say on this, which perhaps we’ll take up in a subsequent post, but for now you can find more information in the paper I presented at the conference, “Applying Blockchain Solutions to Address Research Reproducibility and Enable Scientometric Analysis” (Kochalko, Morris, and Rollins 2018), and on our website, www.artifacts.ai. And for anyone who will be in London, consider attending the Outsell Signature Event, 3–4 October. ARTiFACTS is in Outsell’s top 50 companies to watch and is one of the 10 companies selected to present at this event. So please look us up and listen to our talk.

References

‘23rd International Conference on Science and Technology Indicators’. STI 2018, 2018. http://sti2018.cwts.nl/.

‘European Network of Indicator Designers’. Accessed 28 September 2018. http://www.forschungsinfo.de/ENID/.

‘CWTS – Centre for Science and Technology Studies – Leiden University’. CWTS. Accessed 28 September 2018. https://www.cwts.nl/.

‘STI 2018: Proceedings’. STI 2018, 2018. http://sti2018.cwts.nl/proceedings/.

Neylon, Cameron. ‘Open Science Needs Open Indicators’. Keynote slides, STI 2018, 2018. https://www.slideshare.net/CameronNeylon/open-science-needs-open-indicators.

Teplitskiy, M., E. Duede, M. Menietti, and K. Lakhani. ‘Why (Almost) Everything We Know About Citations Is Wrong: Evidence from Authors’. In Proceedings of the 23rd International Conference on Science and Technology Indicators (STI 2018), 1488. 11 September 2018. https://openaccess.leidenuniv.nl/handle/1887/65227.

Wouters, P., Z. Zahedi, and R. Costas. ‘Social Media Metrics for New Research Evaluation’. In Proceedings of the 23rd International Conference on Science and Technology Indicators (STI 2018), 1133. 11 September 2018. https://openaccess.leidenuniv.nl/handle/1887/65345.

Wang, Jian, Reinhilde Veugelers, and Paula Stephan. ‘Bias against Novelty in Science: A Cautionary Tale for Users of Bibliometric Indicators’. Research Policy 46, no. 8 (1 October 2017): 1416–36. https://doi.org/10/gb22vw.

‘ARTiFACTS – A Blockchain Platform for Scientific & Academic Research’. ARTiFACTS. Accessed 28 September 2018. https://artifacts.ai/.

Kochalko, D., C. Morris, and J. Rollins. ‘Applying Blockchain Solutions to Address Research Reproducibility and Enable Scientometric Analysis’. In Proceedings of the 23rd International Conference on Science and Technology Indicators (STI 2018), 395. 11 September 2018. https://openaccess.leidenuniv.nl/handle/1887/65349.

‘Outsell Signature Event 2018 | Outsell Inc.’, 2018. https://www.outsellinc.com/outsell-signature-event-2018/.

Disclosure

Dave Kochalko is Co-founder and Chief Academic Officer of ARTiFACTS.

DOI: 10.25815/R55E-9S28

Citation format: The Chicago Manual of Style, 17th Edition

Kochalko, Dave. ‘Citation Indicators in Transition’, 2018. https://doi.org/10.25815/R55E-9S28