Artifact Evaluation Track
Authors of accepted research papers are invited to submit an artifact to the ICPE Artifact Track. According to ACM’s “Result and Artifact Review and Badging” policy, an “artifact” is “a digital object that was either created by the authors to be used as part of the study or generated by the experiment itself […] software systems, scripts used to run experiments, input datasets, raw data collected in the experiment, or scripts used to analyze results”. A formal review of such artifacts not only ensures that the “study is repeatable” by the same team; if the artifacts are also available online, other researchers “can replicate the findings” as well.
In this spirit, the ICPE 2020 Artifacts Track exists to review, promote, share, and catalog the research artifacts produced by any of the full papers accepted to the research track. Beyond repeatability and replicability, cataloging these artifacts also enables their reuse by other teams in reproduction or other studies. Artifacts of interest include (but are not limited to):
Tools, libraries or frameworks, which are implementations of systems or algorithms essential for the results described in the associated paper, possibly also useful in other work.
Data or repositories, which are essential for the results described in the associated paper, ideally also useful in other work. The authors must ensure that, at the camera-ready deadline, the artifacts are generally available from a stable URL or DOI with an archival plan, such as the SPEC RG Zenodo repository (a personal page is not sufficient).
If you require an exception from the conditions above, please mail the chairs before submitting.
What do you get out of it?
If your artifact is accepted, it will receive one of the following badges in the text of the paper and in the ACM Digital Library:
Artifacts Evaluated - Functional: The artifacts are complete, well-documented, and make it possible to obtain the same results as the paper.
Artifacts Evaluated - Reusable: As above, but the artifacts are of such high quality that they can be reused as is on other data sets, or for other purposes.
Artifacts Available: For artifacts made permanently available. This will only be awarded in conjunction with one of the Artifacts Evaluated badges.
Regarding archiving, all accepted artifacts will be indexed on the conference website.
How to submit?
Submissions are made via EasyChair.
Submission deadlines are listed on the important dates page.
To submit an artifact for your accepted ICPE 2020 full research track paper, keep in mind: a) how accessible you are making your artifact to other researchers, and b) the fact that the ICPE artifact evaluators will have very limited time to assess each artifact. Artifacts whose configuration and installation take an undue amount of time may be rejected. If you envision difficulties, please provide your artifact in an easily portable form, such as a virtual machine image (https://www.virtualbox.org) or a container image (https://www.docker.com).
In either case, your artifact should be made available as a link to a single archive file using a widely available compressed archive format (preferably zip or tar.gz).
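For example, such an archive can be produced with a few lines of Python. The sketch below is only illustrative: the directory name artifact/ and the archive name icpe2020-artifact.tar.gz are assumptions, not names required by this call, and the same result can of course be obtained with standard zip or tar tools.

```python
# Minimal packaging sketch (illustrative only): bundle a local "artifact/"
# directory, assumed to already contain index.html, the code/data, and the
# paper, into a single tar.gz archive suitable for submission.
import tarfile
from pathlib import Path

ARTIFACT_DIR = Path("artifact")             # hypothetical directory holding your artifact
ARCHIVE_NAME = "icpe2020-artifact.tar.gz"   # illustrative archive name

with tarfile.open(ARCHIVE_NAME, "w:gz") as archive:
    # Add the whole directory so the archive unpacks into a single,
    # self-contained top-level folder.
    archive.add(ARTIFACT_DIR, arcname=ARTIFACT_DIR.name)

print(f"Wrote {ARCHIVE_NAME}")
```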
The repository or archive must (a quick pre-submission check is sketched after this list):
- be self-contained (with the exception of pointers to external tools or libraries, which we will not consider being part of the evaluated artifact, but which we will try to use when evaluating the artifact)
- contain an HTML file called index.html, together with a specification of software and hardware requirements, that fully describes the artifact and includes (relative) links to the files (included in the archive) that constitute the artifact; in particular, index.html should:
  - include a getting started guide that stresses the key elements of your artifact and enables the reviewers to run, execute, or analyze your artifact without any technical difficulty
  - include step-by-step instructions (another section within index.html) on how you propose to evaluate your artifact
  - where appropriate, include descriptions of and links to files (included in the archive) that represent expected outputs
- contain the artifact itself, which may include, but is not limited to, source code, executables, data, a virtual machine image, and documents (please use open formats for documents)
- contain the submitted version of your research track paper
- optionally, authors are encouraged to submit a link to a short video (maximum 5 minutes) demonstrating the artifact.
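Before submitting, it may help to confirm that the archive really contains the items listed above. The following is a minimal sketch of such a check; it assumes the archive and paper file names used in the packaging example above (icpe2020-artifact.tar.gz and paper.pdf), which are illustrative, while index.html is the one name fixed by this call.

```python
# Pre-submission sanity check (illustrative only): verify that the archive
# contains the items requested above. Apart from index.html, the file names
# below are assumptions, not names mandated by the track.
import sys
import tarfile

ARCHIVE = "icpe2020-artifact.tar.gz"   # illustrative archive name
REQUIRED = [
    "index.html",   # entry point with the getting started guide and instructions
    "paper.pdf",    # submitted version of the research track paper (assumed name)
]

with tarfile.open(ARCHIVE, "r:gz") as archive:
    names = [member.name for member in archive.getmembers()]

# A required file counts as present if it appears at any depth in the archive.
missing = [req for req in REQUIRED
           if not any(name == req or name.endswith("/" + req) for name in names)]

if missing:
    sys.exit("Missing from archive: " + ", ".join(missing))
print("All required files found.")
```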
To facilitate artifact review, we request the submission of a 1-page PDF to the ICPE submission site that includes a brief summary of the artifact, the link to the artifact, and the specification of software and hardware requirements.
Review Process and Selection Criteria
The artifact will be evaluated in relation to the expectations set by the paper. Although reviewers will have access to your paper (via your repository or archive), please make it very clear how they can run your artifact or analyze your data set to replicate your study, without having to hunt for this information. Reviewers may try to tweak the provided inputs and create new ones, to test the limits of the system.
Submitted artifacts will go through a two-phase evaluation:
Kicking the tires: reviewers check the artifact integrity and look for any possible setup problems that may prevent it from being properly evaluated (corrupted or missing files, VM does not start, immediate crashes on the simplest example). Authors are informed of the outcome and, in case of technical problems, they can help solve them during a brief author response period.
Artifact assessment: reviewers evaluate the artifacts, checking if they live up to the expectations created by the paper.
Since portability bugs are easy to introduce, the review committee may issue additional requests to authors during the assessment to fix such bugs in the artifact. The resulting version of the artifact will be considered “final” and should allow reviewers to decide on artifact acceptance and badges.
Artifacts will be scored using the following criteria:
Artifacts Evaluated - Functional:
- Documented: Is the artifact accompanied by relevant documentation making it easy to use?
- Consistent: Is the artifact relevant to the associated paper, and does it contribute in some inherent way to the generation of its main results?
- Complete: To the extent possible, are all components relevant to the paper in question included? (Proprietary artifacts need not be included. If they are required to exercise the package, then this should be documented, along with instructions on how to obtain them. Proxies for proprietary data should be included so as to demonstrate the analysis.)
- Exercisable: If the artifact is executable, is it easy to download, install, or execute? Included scripts and/or software used to generate the results in the associated paper can be successfully executed, and included data can be accessed and appropriately manipulated.
Artifacts Evaluated - Reusable:
- The artifacts associated with the paper are of a quality that significantly exceeds minimal functionality. That is, they have all the qualities of the Artifacts Evaluated – Functional level, but, in addition, they are very carefully documented and well-structured to the extent that reuse and repurposing is facilitated. In particular, norms and standards of the research community for artifacts of this type are strictly adhered to.
Artifacts Available:
- Author-created artifacts relevant to this paper have been placed on a publicly accessible archival repository. A DOI or link to this repository along with a unique identifier of the object is provided.
Artifact Evaluation Chairs
Andre van Hoorn, University of Stuttgart, Germany
Simona Bernardi, University of Zaragoza, Spain
Evaluation Committee
André Bauer, University of Würzburg, Germany
Jinfu Chen, Concordia University, Canada
Daniele Di Pompeo, Università dell’Aquila, Italy
Holger Eichelberger, University of Hildesheim, Germany
Matthew Forshaw, Newcastle University, UK
Abel Gómez Llana, Universitat Oberta de Catalunya, Spain
Elena Gómez-Martínez, Universidad Autónoma de Madrid, Spain
Vojtech Horky, Charles University, Czech Republic
Emilio Incerto, Gran Sasso Science Institute, Italy
Shady Issa, University of Lisbon, Portugal
Colin Paterson, University of York, UK
Alessandro Pellegrini, Sapienza University of Rome, Italy
Diego Perez Palacin, Linnaeus University, Sweden
Jose Ignacio Requeno, INP - Ensimag, Grenoble, France
Alexandru Uta, Vrije Universiteit Amsterdam, Netherlands
Simon Eismann, University of Würzburg, Germany