
Artifact Evaluation: FAD or Real News?

DocUID: 2018-005

Full Text: PDF

Authors: Bruce R. Childers, Panos K. Chrysanthis

Abstract: Data Management (DM), like many areas of computer science (CS), relies on empirical evaluation that uses software, data sets, and benchmarks to evaluate new ideas and compare them with past innovations. Despite the importance of these artifacts and the associated information about experimental evaluations, few researchers make them available in a findable, accessible, interoperable, and reusable (FAIR) manner, thereby hindering the scientific process by limiting open collaboration, the credibility of published outcomes, and research progress. Fortunately, this problem has been recognized, and many CS communities, including DM, are advocating and providing incentives for software and analysis papers to follow FAIR principles and be treated on par with traditional publications. Some ACM/IEEE conferences have adopted Artifact Evaluation (AE) to reward authors for doing a great job in conducting experiments with FAIR software and data. Half a decade after AE’s inception, the question is whether the emerging emphasis on artifacts is having a real impact on CS research.

Keywords: Reproducibility, FAIR principles

Published In: 34th IEEE International Conference on Data Engineering

Pages: 1664-1665

Place Published: Paris, France

Year Published: 2018

Note: Lightning Talk

Project: Artifact Evaluation

Subject Area: Data Dissemination

Publication Type: Conference Paper

Sponsor: NSF CBET-1609120, Others

Citation: Bruce R. Childers and Panos K. Chrysanthis. Artifact Evaluation: FAD or Real News? 34th IEEE International Conference on Data Engineering, pp. 1664-1665, Paris, France, 2018. (Note: Lightning Talk)
