Evaluating solutions

When the competition is over, copy all solutions submitted by the contestants to solutions/contestant/task. If you use our submitting system, you can call bin/mo-grab to do this.
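If you copy the files by hand, the expected layout can be sketched as follows. This is only a simulation in a temporary directory: the contestant name "alice", the task name "sumsq", and the submits directory are made up for illustration; bin/mo-grab produces the same solutions/contestant/task structure automatically.

```shell
# Simulated example: place each submitted file under solutions/<contestant>/<task>.
# "alice", "sumsq" and the submits/ directory are invented for this sketch.
base=$(mktemp -d)
mkdir -p "$base/submits/alice/sumsq"
echo 'int main(void) { return 0; }' > "$base/submits/alice/sumsq/sumsq.c"

for contestant in alice; do
  for task in sumsq; do
    mkdir -p "$base/solutions/$contestant/$task"
    cp "$base/submits/$contestant/$task/"* "$base/solutions/$contestant/$task/"
  done
done

ls "$base/solutions/alice/sumsq"
```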

Then you can call bin/ev contestant task to evaluate a single solution. (In some cases, you can also add the name of the solution as the third parameter, which is useful for comparing different authors' solutions.)

You can also use bin/mo-ev-all task names to evaluate the solutions of all contestants for the specified tasks.
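Conceptually, this amounts to running bin/ev once per contestant. The loop below only prints the commands it would run (the contestant and task names are invented, and no evaluator is actually invoked); in a real contest directory, bin/mo-ev-all does the equivalent work for you.

```shell
# Sketch of what batch evaluation boils down to: one bin/ev run per contestant.
# The commands are printed, not executed; names are illustrative.
demo=$(mktemp -d)
mkdir -p "$demo/solutions/alice/sumsq" "$demo/solutions/bob/sumsq"

for dir in "$demo"/solutions/*/; do
  contestant=$(basename "$dir")
  echo "bin/ev $contestant sumsq"
done
```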

The open data problems are evaluated differently: run bin/ev-open or bin/mo-ev-open-all instead.


For each solution evaluated, bin/ev creates the directory testing/contestant/task containing:


The solutions under evaluation run in a simple sandbox which restricts the time, memory, and system calls available to the program. You can set the sandbox options in the top-level config file; see bin/box --help for a list of the available ones.
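As a purely hypothetical illustration of what such settings can look like, the fragment below uses invented variable and option names; the authoritative list of options is whatever bin/box --help prints, and the real variable names are those documented in the config file itself.

```
# Hypothetical config fragment -- the names below are invented for illustration;
# consult `bin/box --help` and the comments in the top-level config file.
TEST_TIMEOUT=10          # hypothetical: per-test time limit in seconds
BOX_EXTRAS=--mem=65536   # hypothetical: extra options passed to the sandbox
```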

Score table

The bin/mo-score utility can be used to generate an HTML score table from all the points files. The quality of the output is not perfect, but it can serve as a basis for further formatting.
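As a rough sketch of the idea, the toy script below builds a minimal HTML table from points files. It is a stand-in, not bin/mo-score itself: the directory names are invented, and the points format is simplified to a single total per solution.

```shell
# Toy illustration: collect points files into an HTML score table.
# Layout and points format are simplified; real points files may differ.
root=$(mktemp -d)
mkdir -p "$root/testing/alice/sumsq" "$root/testing/bob/sumsq"
echo 10 > "$root/testing/alice/sumsq/points"
echo 7  > "$root/testing/bob/sumsq/points"

echo "<table>"
for p in "$root"/testing/*/*/points; do
  task=$(basename "$(dirname "$p")")
  contestant=$(basename "$(dirname "$(dirname "$p")")")
  echo "<tr><td>$contestant</td><td>$task</td><td>$(cat "$p")</td></tr>"
done
echo "</table>"
```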