In the era of data-driven and open science, reliable and comprehensive assessment of scientific impact and merit is more valuable than ever. The AIMinScience workshop aims to bring together academics, researchers, and other professionals from diverse fields to share ideas and experiences about research assessment and relevant technologies and applications.
Over the last decades, the growth rate of scientific articles and related research objects (e.g., data sets, software packages) has been increasing, a trend that is expected to continue. This is due not only to the increase in the number of researchers worldwide, but also to the growing competition that pressures them to continuously produce publishable results, a phenomenon widely known as “publish or perish”. This trend has also been correlated with a significant drop in the average quality of published research.

In this context, reliable and comprehensive metrics and indicators of the scientific impact and merit of publications, data sets, research institutions, individual researchers, and other relevant entities are more valuable than ever. Scientific impact refers to the attention a research work receives inside its respective and related disciplines, in social and mass media, and elsewhere. Scientific merit, on the other hand, concerns quality aspects of the work (e.g., its novelty, reproducibility, FAIRness, readability).

Any impact or merit metrics and indicators rely on the availability of publication-related (meta)data (e.g., abstracts, citations) which, until recently, were locked inside the data silos of publishers and research institutions. Today, however, thanks to the growing popularity of Open Science initiatives, many useful science-related data sets have been made openly available, paving the way for more sophisticated scientific impact and merit indicators and, consequently, more precise research assessment.
Topics of interest include (but are not limited to):
Scientometrics & bibliometrics.
Indicators & metrics of any kind (e.g., citation-based, altmetrics) to assess scientific impact or merit for publications, other research objects (e.g., software, data sets), or relevant entities (e.g., individual researchers, institutions, projects, funding organisations).
Improved/scalable/efficient approaches to preprocess, analyse, and mine big scholarly data to facilitate research assessment.
Insightful visualisation techniques that utilise or facilitate research assessment.
Applications utilising scientific impact & merit to provide useful services to the research community and the industry (e.g., recommendation systems, ranking mechanisms, etc).
Data integration/creation/cleaning/publishing projects to facilitate research assessment.
Data mining & machine learning approaches to facilitate research assessment.
Studies regarding the characteristics or the evolution of scientific impact or merit and/or their connection with other measures or phenomena.
Deadline extension: Due to the ongoing coronavirus outbreak, which currently affects the work of many researchers, we have extended the submission deadline of the workshop:
Submission deadline: April 30, 2020 (23:59, AoE; extended from April 4, 2020)
Notification of acceptance: May 27, 2020 (extended from May 5, 2020)
Camera ready due: June 5, 2020
The AIMinScience 2020 submission site is now open!
The workshop solicits regular papers (up to 12 pages including references & appendices, LNCS format) and short papers (up to 6 pages including references & appendices). The former should describe complete, original research or technical work. The latter should demonstrate interesting new software or data sets, or describe interesting, innovative ideas that nevertheless require more work to mature. Submissions will be peer-reviewed by program committee members. To ensure transparent evaluation and reproducibility of research, submissions whose results are based on research data or research software are required to cite the original data/software as properly deposited in a data/software repository (e.g., Zenodo, Figshare, B2SHARE).
All submissions should be in English and submitted as a PDF file. Authors should consult Springer’s authors’ guidelines and use their proceedings templates, either for LaTeX or Word, for the preparation of their papers. Inclusion of papers in the proceedings is conditional upon registration of at least one author per paper.
Workshop proceedings will be published in the Springer CCIS series. Each article will be assigned a DOI and associated with the authors’ ORCID iDs and funding projects (if any), for reporting to the European Commission or other funders.
**Best papers will be invited for submission to a special issue of the Quantitative Science Studies (QSS) open access journal launched by the International Society for Scientometrics and Informetrics (ISSI). Read the inspirational story of the creation of this journal here.**
The European OpenAIRE infrastructure aims to facilitate, foster, support, and monitor Open Science scholarly communication in Europe. In this context, OpenAIRE populates a research graph whose objects are scientific results (e.g., papers, software, data sets), organizations, funders, communities, and data sources. This graph contains rich information, harvested and integrated from multiple sources, that can be used for scientific impact and merit assessment. A challenge cup, whose objective is to exploit the knowledge captured by the OpenAIRE Research Graph to solve a predetermined set of challenges, will be organized alongside the AIMinScience workshop. A dedicated session presenting the work of the best-performing teams will take place during the workshop. The challenges of the cup will be announced soon.
(Titles to be finalised)
"Beyond the impact factor: possibilities of scientometrics to understand science and society", Dr. Rodrigo Costas
"Quantifying the biases of scientific success", Prof. Roberta Sinatra
"Predicting the future evolution of scientific output", Prof. Yannis Manolopoulos
Co-chairs for AIMinScience 2020 (alphabetically):
Dr. Paolo Manghi (full-time Researcher at ISTI-CNR, Italy & CTO of the OpenAIRE infrastructure)
Dr. Dimitris Sacharidis (assistant professor at TU Wien, Austria)
Dr. Thanasis Vergoulis (scientific associate at “Athena” RC, Greece)
Alessia Bardi (ISTI-CNR, IT)
Nikos Bikakis (Atypon Inc., GR)
Lutz Bornmann (Max Planck Society, DE)
Guillaume Cabanac (Univ. of Toulouse, FR)
Rodrigo Costas (Leiden Univ., NL)
Christos Giatsidis (LIX, École Polytechnique, FR)
John P. A. Ioannidis (Medicine - Stanford Prevention Research Center, USA)
Adam Jatowt (Kyoto Univ., JP)
Ilias Kanellos (ATHENA RC, GR)
Georgia Koutrika (ATHENA RC, GR)
Anastasia Krithara (NCSR Demokritos, GR)
Andrea Mannocci (ISTI-CNR, IT)
Yannis Manolopoulos (Aristotle Univ. of Thessaloniki, GR)
Giannis Nikolentzos (LIX, École Polytechnique, FR)
Paraskevi Raftopoulou (Univ. of the Peloponnese, GR)
Maria Jose Rementeria (BSC, ES)
Angelo A. Salatino (The Open University, UK)
Roberta Sinatra (IT University of Copenhagen, DK)
Cassidy R. Sugimoto (Indiana Univ. Bloomington, USA)
Christos Tryfonopoulos (Univ. of the Peloponnese, GR)
Giannis Tsakonas (Univ. of Patras, GR)
Ludo Waltman (Leiden Univ., NL)
Xiaoying Wu, Dimitri Theodoratos, Dimitrios Skoutas and Michael Lan: "Exploring Citation Networks with Hybrid Tree Pattern Queries"
Serafeim Chatzopoulos, Thanasis Vergoulis, Ilias Kanellos, Theodore Dalamagas and Christos Tryfonopoulos: "ArtSim: Improved estimation of current impact for recent articles"
Pantelis Chronis, Dimitris Skoutas, Spiros Athanasiou and Spiros Skiadopoulos: "Link Prediction in Bibliographic Networks"
George Papastefanatos, Elli Papadopoulou, Marios Meimaris, Antonis Lempesis, Stefania Martziou, Paolo Manghi and Natalia Manola: "Open Science Observatory: Monitoring Open Science in Europe"
Georgios Stoupas, Antonis Sidiropoulos, Dimitrios Katsaros and Yannis Manolopoulos: "Skyline-based University Rankings"