

Call for Artifact Evaluations

It's becoming increasingly difficult to reproduce results from systems and ML papers. Voluntary Artifact Evaluation (AE) has been successfully introduced at systems conferences and tournaments (ReQuEST, PPoPP, CGO, and Supercomputing) to validate experimental results by an independent AE Committee, share unified Artifact Appendices, and assign reproducibility badges.

MLSys also promotes reproducibility of experimental results and encourages code and data sharing to help the community quickly validate and compare alternative approaches. Authors of accepted MLSys'23 papers are invited to formally describe supporting material (code, data, models, workflows, results) using the standard Artifact Appendix template and submit it to the Artifact Evaluation process. Note that this submission is voluntary and will not influence the final decision regarding the papers. The goal is to help authors validate the experimental results from their accepted papers with an independent AE Committee in a collaborative way, and to help readers find articles with available, functional, and validated artifacts. For example, the ACM Digital Library already allows one to search for papers with available artifacts and reproducible results!

You need to prepare your artifacts and appendix using the guidelines below. You can then submit your paper with the artifact appendix via the dedicated MLSys AE website by the submission deadline listed under Dates below. Be sure to include a link to the GitHub (or similar) page so reviewers can access your code. Your submission will then be reviewed according to the guidelines in the Artifact Evaluation process. Please do not forget to provide a list of hardware, software, benchmark, and data set dependencies in your artifact abstract - this is essential for finding appropriate evaluators!
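
To make the dependency list concrete, here is a minimal, hypothetical sketch of a script an author might ship alongside the artifact to report the software and hardware environment to evaluators. The probed packages (numpy, torch, tensorflow) are illustrative assumptions only and not part of any official AE requirement; list whatever your artifact actually depends on.

    # report_env.py - illustrative sketch of an environment/dependency
    # report for AE evaluators. The probed packages are examples only.
    import importlib
    import platform
    import sys

    def probe(package: str) -> str:
        """Return the installed version of a package, or 'not installed'."""
        try:
            module = importlib.import_module(package)
            return getattr(module, "__version__", "unknown version")
        except ImportError:
            return "not installed"

    if __name__ == "__main__":
        print(f"OS:     {platform.platform()}")
        print(f"CPU:    {platform.processor() or platform.machine()}")
        print(f"Python: {sys.version.split()[0]}")
        # Example dependencies -- replace with your artifact's actual ones.
        for pkg in ("numpy", "torch", "tensorflow"):
            print(f"{pkg}: {probe(pkg)}")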

AE is run by a separate committee whose task is to assess how well submitted artifacts support the work described in the accepted papers, based on the standard ACM Artifact Review and Badging policy. Since full validation of AI/ML experiments can be very time consuming and can require expensive computational resources, we decided to validate only whether submitted artifacts are "available" and "functional" at MLSys'23. However, you still need to provide a small sample data set to test the functionality of your artifact. Thus, depending on the evaluation results, camera-ready papers will include the artifact appendix and receive at most two ACM stamps of approval printed on their first page: Available and Functional.
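
To illustrate what a "functional" check on a small sample data set could look like, below is a minimal, hypothetical smoke test. The file name sample_data.csv and the predict function are placeholder assumptions standing in for an artifact's real data and entry points, not a prescribed AE interface.

    # smoke_test.py - illustrative sketch of a functionality check an
    # evaluator could run on a small sample data set shipped with the
    # artifact. sample_data.csv and predict() are placeholders.
    import csv

    def predict(features):
        # Placeholder model: a real artifact would call its own code here.
        return sum(features) / len(features)

    def main():
        with open("sample_data.csv", newline="") as f:
            rows = [[float(x) for x in row] for row in csv.reader(f)]
        assert rows, "sample data set is empty"
        outputs = [predict(row) for row in rows]
        # Check that the pipeline runs end to end and yields finite numbers.
        assert len(outputs) == len(rows)
        assert all(o == o for o in outputs)  # NaN check: NaN != NaN
        print(f"OK: produced {len(outputs)} predictions from sample data")

    if __name__ == "__main__":
        main()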


Dates

  • Artifact submission deadline: Mar 31, 2023 (AoE)
  • AE result notification: Apr 28, 2023

Links