Currently, we accept submissions by email only. Note that the results are evaluated on a semi-regular basis (once every few months).


The results must be submitted as a .csv file with two columns. The first column must contain the predicted proliferation scores based on mitosis counting (task 1), and the second column must contain the predicted molecular proliferation scores (task 2). If you are submitting results for only one task, fill the column for the other task with "-1" values. The row number indicates the test case ID (for example, the sixth row contains the prediction for TUPAC-TE-006.svs).
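The two-column format described above can be produced with a few lines of Python. This is a minimal sketch; the score values, the number of cases, and the file name `predictions.csv` are made up for illustration, and it assumes a submission for task 1 only (so the task 2 column is filled with -1).

```python
import csv

# Hypothetical task 1 predictions (proliferation scores from mitosis counting);
# row i corresponds to test case TUPAC-TE-00(i+1).svs.
task1_scores = [3, 1, 2, 3, 1, 2]

with open("predictions.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for score in task1_scores:
        # Only task 1 is submitted, so the task 2 column is filled with -1.
        writer.writerow([score, -1])
```

Submitting both tasks would simply mean replacing the -1 with the molecular proliferation score for the same case.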

Update 24/08: Results for the mitosis detection task (task 3) must be submitted as .csv files (one for each image, following the same directory structure as the image data; for example, the results for the fifth case in the testing set will be written to 05/01.csv). Each row in the CSV file must correspond to one detected mitotic figure location. The first two columns must contain the image coordinates of the detection (row-column order). An optional third column can contain a confidence value for the detection. If participants submit results in this format, they must also provide a threshold for the confidence value, such that all objects with confidence above the threshold are considered detected mitotic figures. The threshold value must be identical for all HPFs from all patients. If the third column is not provided, all objects in the CSV file will be considered detected mitotic figures. Although the evaluation of results does not take the confidence values of the detected mitotic figures into account (see below), this information might be used in the summary paper to plot free-response ROC curves or similar graphs.
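The per-image detection format and the thresholding rule above can be sketched as follows. The coordinates, confidence values, threshold of 0.5, and the flat file name `01.csv` are all hypothetical illustrations (in a real submission the file would sit inside the per-case directory, e.g. 05/01.csv).

```python
import csv

# Hypothetical detections for one HPF: (row, column, confidence).
detections = [
    (120, 340, 0.92),
    (455, 210, 0.31),
    (78, 600, 0.67),
]
threshold = 0.5  # arbitrary example value; must be the same for all HPFs

# Write one CSV row per detected mitotic figure: row, column, confidence.
with open("01.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for row, col, conf in detections:
        writer.writerow([row, col, conf])

# Reading back: only objects with confidence above the threshold
# count as detected mitotic figures.
with open("01.csv", newline="") as f:
    detected = [
        (int(r), int(c))
        for r, c, conf in csv.reader(f)
        if float(conf) > threshold
    ]
```

With the example values above, only the first and third objects pass the threshold; omitting the confidence column would instead mark every row as a detection.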

Each submission must be accompanied by a method description in the form of a four-page paper (PDF; we recommend using the IEEE template) containing the following sections:

  • Introduction - overview of your approach and methods;
  • Methods - detailed description of the methods;
  • Experiments and Results - detailed description of the training experiments and procedure, and expected results (such as results on a subset of the training set used for validation);
  • Discussion (optional) - discussion of the methods and expected results.

The description of the methods and experiments must be detailed enough to enable reproducibility. If possible, we also ask that you provide links to code and/or trained models. The method description is only for the purpose of reviewing the submission and it will not be made publicly available. The authors are free to publish their methods and results independently after participation in the challenge. 

Each participating group or individual will be able to submit results up to three times before the challenge meeting.

The title of the results page must be in the format "Results by teamUsername submission #". Example: "Results by test 1". Any notes or remarks regarding the submission should be specified in the body of the results page. Please do not put the method description in the body of the results page. The submitted results and the method description should be added as file attachments to the results page.

Any previous submissions can be viewed in the right sidebar of this page. The results of the evaluation will be added as attachments to the results page by the organizers. You will be notified by email when the results of the evaluation are uploaded. We aim to evaluate all submitted results within three days after the submission.