After installation is successful, you want to run your contest! Configuring DOMjudge to run a contest (or a number of them, in sequence) involves the steps described in this section.
DOMjudge stores and retrieves most of its data from the MySQL database. Some information must be filled in beforehand; other tables will be populated by DOMjudge itself.
You can use the jury web interface to add, edit and delete most types of data described below. It is advised to keep phpMyAdmin handy in case of emergencies, or for general database operations like import and export.
This section describes the meaning of each table and what you need to put into it. Tables marked with an `x' are the ones you have to configure with contest data before running a contest (via the jury web interface or e.g. with phpMyAdmin); the other tables are used automatically by the software:
  | auditlog              | Log of every state-changing event.
  | balloon               | Balloons to be handed out.
  | clarification         | Clarification requests/replies are stored here.
x | configuration         | Runtime configuration settings.
x | contest               | Contest definitions with start/end time.
x | contestproblem        | Coupling of problems to contests and data specific to it.
x | contestteam           | Coupling of teams to contests.
  | event                 | Log of events during contests.
x | executable            | Executable compile/run/compare scripts.
  | judgehost             | Computers (hostnames) that function as judgehosts.
x | judgehost_restriction | Optional restriction sets on submissions taken by judgehosts.
  | judging               | Judgings of submissions.
  | judging_run           | Result of one testcase within a judging.
x | language              | Definition of allowed submission languages.
x | problem               | Definition of problems (name, timelimit, etc.).
  | rankcache_jury        | Cache of team ranking data for public/teams and for the jury
  | rankcache_public      | separately, because of the possibility of score freezing.
  | rejudging             | Metadata for batched rejudging.
  | role                  | Possible user roles.
  | scorecache_jury       | Cache of the scoreboards for public/teams and for the jury
  | scorecache_public     | separately, because of the possibility of score freezing.
  | submission            | Submission metadata of solutions to problems.
  | submission_file       | Submitted code files.
x | team                  | Definition of teams.
x | team_affiliation      | Definition of institutions a team can be affiliated with.
x | team_category         | Different category groups teams can be put in.
  | team_unread           | Records which clarifications are read by which team.
x | testcase              | Definition of testdata for each problem.
x | user                  | Users that will be able to access the system.
x | userrole              | Mapping of users to their roles.
Now follows a longer description (including fields) per table that has to be filled manually. As a general remark: almost all tables have an identifier field. Most of these are numeric and automatically increasing; these do not need to be specified. The tables executable and language have text strings as identifier fields. These need to be manually specified and only alpha-numeric, dash and underscore characters are valid, i.e. a-z, A-Z, 0-9, -, _.
The configuration table contains runtime configuration settings. These entries are simply stored as name, value pairs, where the values are JSON encoded; type contains the allowed data type, and description documents the configuration setting.
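For illustration, changing a setting directly in the database could look as follows (a minimal sketch; the setting name is an example, values must be JSON encoded, and normally you would edit settings through the jury interface):

    -- sketch: raise the penalty time to 30 minutes;
    -- the value is the JSON encoding of the integer 30
    UPDATE configuration
       SET value = '30'
     WHERE name = 'penalty_time';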
The contest table holds the contests that the software will run, e.g. a test session and the live contest. cid is the reference ID and contestname is a descriptive name used in the interface, while shortname is the publicly visible identifier. activatetime, starttime and endtime are required fields and specify when this contest is active and open for submissions. The optional freezetime and unfreezetime control scoreboard freezing, and deactivatetime when the contest is not visible anymore. For a detailed treatment of these, see the section Contest milestones.

All contest times can be specified relative to starttime, except of course starttime itself. The input given in the jury interface (either relative or absolute) is stored in the *time_string fields, while a calculated absolute version is stored in the fields without the _string suffix.

The public field can be used to limit which contests are displayed as public scoreboards (as opposed to privately to a selected set of teams), while enabled can be used to (temporarily) disable a contest altogether.
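As a sketch of how the relative time input relates to the calculated fields, a contest whose end time is given relative to the start might be stored along these lines (hypothetical values; the exact column types vary per DOMjudge version, and the jury interface computes the absolute fields for you):

    -- sketch: endtime_string holds the relative input '+5:00:00',
    -- endtime holds the calculated absolute time
    INSERT INTO contest (contestname, shortname,
                         activatetime, activatetime_string,
                         starttime,    starttime_string,
                         endtime,      endtime_string,
                         enabled, public)
    VALUES ('Demo contest', 'demo',
            '2024-06-01 09:00:00', '2024-06-01 09:00:00',
            '2024-06-01 10:00:00', '2024-06-01 10:00:00',
            '2024-06-01 15:00:00', '+5:00:00',
            1, 1);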
This table couples problems to contests: cid and probid describe the pairing. Furthermore, it stores problem data that is specific to the contest it is part of: shortname is a contest-unique identifier string for the problem; points defaults to 1 and can be set to assign non-even scoring; allow_submit determines whether teams can submit solutions for this problem. Non-submittable problems are also not displayed on the scoreboard, which can be used to define spare problems that can then be added to the contest quickly. allow_judge determines whether judgehosts will judge submissions for this problem; see also the explanation for language. The color tag can be filled with a CSS colour specification to associate with this problem; see also the section Scoreboard: colours.
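For example, coupling an existing problem to a contest could be done as follows (a sketch with made-up IDs; the jury interface does this for you when editing a contest):

    -- couple problem 5 to contest 2 as problem 'A', worth 1 point,
    -- shown in green on the scoreboard
    INSERT INTO contestproblem (cid, probid, shortname, points,
                                allow_submit, allow_judge, color)
    VALUES (2, 5, 'A', 1, 1, 1, 'green');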
This table couples teams to contests. Teams can only submit solutions to problems in contests that are public or that they are part of.
The executable table stores zip-bundles of executable scripts that can be used as compile, run, and compare scripts.
This table encodes restriction sets for selecting which submissions are sent to a judgehost. The restrictions are JSON encoded in the restrictions column and can be set in the admin web interface to restrict to specific contests, problems and languages, and to never rejudge on the same judgehost. A restriction set can be assigned to judgehost(s) on the edit page of the judgehosts overview.
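As an illustration, a stored restriction set might look like the following (the JSON keys shown here are assumptions about the encoding; defining restriction sets through the admin web interface guarantees the correct format):

    -- hypothetical restriction set: only judge contest 2 and Java
    -- submissions, and never rejudge on the same judgehost
    INSERT INTO judgehost_restriction (name, restrictions)
    VALUES ('contest2-java',
            '{"contest": [2], "language": ["java"], "rejudge_own": 0}');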
Programming languages in which to accept and judge submissions. langid is a string of maximum length 8, which references the language. name is the displayed name of the language; extensions is a JSON encoded list of recognized filename extensions; allow_submit determines whether teams can submit using this language; allow_judge determines whether judgehosts will judge submissions in this language. The latter can for example be set to no to temporarily hold judging when a problem occurs with the judging of a specific language; after resolution of the problem it can be set to yes again. time_factor is the relative factor by which the timelimit is multiplied for solutions in this language; compile_script refers to the compile executable script that is used for this language.
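A language entry could for instance look like this (a sketch; the compile script must match an entry in the executable table):

    -- example language row; extensions is a JSON encoded list
    INSERT INTO language (langid, name, extensions, allow_submit,
                          allow_judge, time_factor, compile_script)
    VALUES ('java', 'Java', '["java"]', 1, 1, 1.5, 'java_javac');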
This table contains the problem definitions. probid is the reference ID and name is the full name (description) of the problem. Problems are coupled to one or more contests through the contestproblem table described above, which also holds the per-contest settings shortname, points, allow_submit, allow_judge and color.

timelimit is the timelimit in seconds within which solutions for this problem have to run (taking into account time_factor per language); see also enforcement of time limits for more details. memlimit is the memory limit in kB allotted for this problem; if empty, the global configuration setting memory_limit is used. Equivalently for outputlimit.

special_run, if not empty, defines a custom run program run_<special_run> to run compiled submissions for this problem, and special_compare, if not empty, defines a custom compare program compare_<special_compare> to compare output for this problem.

In problemtext a PDF, HTML or plain text document can be placed which allows team, public and jury to download the problem statement. Note that no additional filtering takes place, so HTML (and PDF to some extent) should come from a trusted source to prevent cross-site scripting or other attacks. The file type is stored in problemtext_type.
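A problem row could then look like this (illustrative values; the problem statement is usually uploaded through the web interface):

    -- example problem with a 5 second time limit and the 'float'
    -- compare script for floating-point aware output comparison
    INSERT INTO problem (name, timelimit, special_compare)
    VALUES ('Float calculations', 5, 'float');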
Table of teams: teamid is the (internal) ID of the team, while externalid can be used to store an ID for im-/exporting to other systems. name is the displayed name of the team; categoryid is the ID of the category the team is in; affilid is the affiliation ID of the team.

When enabled is set to 0, the team immediately disappears from the scoreboards and cannot use the team web interface anymore, even when already logged in. One use case could be to disqualify a team on the spot.

members are the names of the team members, separated by newlines, and room is the location or room of the team; both are for display only. comments can be filled with arbitrary useful information and is only visible to the jury. The timestamp teampage_first_visited and the hostname field indicate when/whether/from where a team visited its team web interface. The penalty field can be used to give this team a (positive or negative) number of penalty minutes to correct for exceptional circumstances.
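For example (made-up IDs; the referenced category and affiliation must exist):

    -- example team row; members are separated by newlines
    INSERT INTO team (name, categoryid, affilid, room, members)
    VALUES ('Pink Ponies', 2, 1, 'Lab 6', 'Alice\nBob\nCarol');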
In the team_affiliation table, affilid is the reference ID and name the name of the institution. country should be the 3-character ISO 3166-1 alpha-3 abbreviation of the country, and comments is a free-form field that is displayed in the jury interface.

A country flag can be displayed on the scoreboard. For this to work, the country field must match a (flag) picture in www/images/countries/<country>.png. All country flags are present there, named with their 3-character ISO codes. See also www/images/countries/README.
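An example entry (illustrative; 'NLD' would match the flag picture www/images/countries/NLD.png):

    -- example affiliation with ISO 3166-1 alpha-3 country code
    INSERT INTO team_affiliation (name, country, comments)
    VALUES ('Utrecht University', 'NLD', 'host institution');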
In the team_category table, categoryid is the reference ID and name is the name of the category. sortorder is the order in which this group must be sorted on the scoreboard, where a higher number sorts lower and equal numbers sort depending on score. The color is again a CSS colour specification used to discern different categories easily; see also the section Scoreboard: colours.

The visible flag determines whether teams in this category are displayed on the public/team scoreboard. This feature can be used to remove teams from the public scoreboard by assigning them to a separate, invisible category.
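For instance, an invisible category could be created like this (a sketch; normally done in the jury interface):

    -- category for observers that is hidden from the public scoreboard
    INSERT INTO team_category (name, sortorder, color, visible)
    VALUES ('Observers', 1, 'silver', 0);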
The testcase table contains testdata for each problem: testcaseid is a unique identifier; input and output contain the testcase input/output, and image an optional graphical representation of the testcase for the jury. The fields md5sum_input, md5sum_output and md5sum_image contain their respective MD5 hashes, which the judgehosts use to check whether their cached versions are still up to date; image_thumb and image_type contain a thumbnail version and mimetype string for the image. The field probid is the corresponding problem and rank determines the order of the testcases for one problem. description is an optional description for this testcase. See also providing testdata.
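Although uploading through the jury interface is the normal route, a testcase could in principle be inserted directly, as sketched below (hypothetical file paths; LOAD_FILE requires the MySQL FILE privilege and a suitable secure_file_priv setting):

    -- insert the first testcase of problem 5 with matching MD5 sums
    INSERT INTO testcase (probid, `rank`, input, output,
                          md5sum_input, md5sum_output, description)
    SELECT 5, 1,
           LOAD_FILE('/tmp/1.in'), LOAD_FILE('/tmp/1.out'),
           MD5(LOAD_FILE('/tmp/1.in')), MD5(LOAD_FILE('/tmp/1.out')),
           'basic example';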
This table contains the users that the system knows about, with their login credentials. Each user may have one or more roles, like being part of a team, or being a jury member or administrator. There are also functional accounts, e.g. for judgedaemons. The userrole table maps users to their roles.
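Coupling a user to a role then amounts to adding a row to userrole, for example (a sketch, assuming userrole couples userid to roleid and the role table stores the role name in a role column):

    -- give the user 'team001' the 'team' role
    INSERT INTO userrole (userid, roleid)
    SELECT u.userid, r.roleid
      FROM user u, role r
     WHERE u.username = 'team001' AND r.role = 'team';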
The contest table specifies timestamps for each contest that mark specific milestones in the course of the contest.
The triplet activatetime, starttime and endtime define when the contest runs and are required fields (activatetime and starttime may be equal).
activatetime is the moment when a contest first becomes visible to the public and teams. Nothing can be submitted yet and the problem set is not revealed. Clarifications can be viewed and sent.
At starttime, the scoreboard is displayed and submissions are accepted. At endtime the contest stops. New incoming submissions will still be processed and judged, but the result will not be shown anymore to teams; they instead receive the verdict `too-late'. Unjudged submissions received before endtime will still be judged normally.
freezetime and unfreezetime control scoreboard freezing. freezetime is the time after which the public and team scoreboard are not updated anymore (frozen). This is meant to make the last stages of the contest more thrilling, because no-one knows who has won. Leaving them empty disables this feature. When using this feature, unfreezetime can be set to automatically `unfreeze' the scoreboard at that time. For a more elaborate description, see also section Scoreboard: freezing and defrosting.
The scoreboard, results and clarifications will remain visible to team and public after the contest has ended, until the deactivatetime.
All events happen at the first moment of the defined time. That is: for a contest with starttime "12:00:00" and endtime "17:00:00", the first submission will be accepted at 12:00:00 and the last one at 16:59:59.
The following ordering must always hold: activatetime <= starttime < (freezetime <=) endtime (<= unfreezetime) (<= deactivatetime).
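As a sketch, a query along these lines would flag contests that violate this ordering (assuming the time fields are directly comparable in SQL):

    -- flag contests whose milestones are out of order
    SELECT cid, contestname
      FROM contest
     WHERE activatetime > starttime
        OR starttime >= endtime
        OR (freezetime IS NOT NULL AND freezetime > endtime)
        OR (unfreezetime IS NOT NULL AND unfreezetime < endtime)
        OR (deactivatetime IS NOT NULL AND deactivatetime < endtime);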
The authentication system lets the domserver know which user it is dealing with and which role(s) the user has. This system is modular, allowing flexible addition of new methods, if required. The following methods are available by default for authentication.
Each user receives a password and PHP's session management is used to keep track of which user is logged in. This method is easiest to set up. It does require the administrator to generate users and passwords for all teams (this can be done in the jury interface) and distribute those, though. Also, each team has to log in each time they (re)start their browser. The password is stored as a salted MD5 hash in the password field of the user table.
The IP-address of a team's workstation is used as the primary means of authentication. The system assumes that someone coming from a specific IP is the user with that IP listed in the user table. When a team browses to the web interface, this is checked and the appropriate team page is presented.
This method has the advantage that teams do not have to log in. A requirement for this method is that each team computer has a separate IP address as seen from the domserver, though, so it is most suitable for onsite contests and might not work with online contests if, for example, multiple teams are located behind a router.
There are two possible ways of configuring team IP-addresses.
Before the contest starts, when entering teams into the database, add the IP that each team will have to that user's entry in the ip_address field. When the teams arrive, everything will work directly and without further configuration (except when teams switch workplaces). If possible, this is the recommended modus operandi, because it's the least hassle just before and during the contest.
Supply the teams with a one-time username and password with which to authenticate. Beforehand, generate passwords for each team in the jury interface. When the test session (or contest) starts and a team with an unknown IP connects to the web interface, it will be prompted for username and password. Once supplied, the IP is stored and the password is removed; it is not needed anymore the next time.
This is also a secure option, but requires a bit more hassle from the teams, and maybe from the organisers who have to distribute pieces of paper.
Note: the web interface will only allow a team to authenticate itself once. If an IP is set, a subsequent authentication attempt will be refused (to avoid trouble with lingering passwords). In order to fully re-authenticate a team, the IP address needs to be unset. You might also want to generate a new password for this specific user. Furthermore, a team must explicitly connect to the team interface URL, because with an unknown IP, the root DOMjudge website will redirect to the public interface.
This method can be useful when you want to integrate DOMjudge into a larger system, or already have credentials available on an LDAP server. The username field in the database must contain the LDAP username of the DOMjudge user. Furthermore, in etc/domserver-config.php the LDAP_* configuration settings must be adapted to your setup. Note that multiple (backup) servers can be specified: they are queried in order, trying to authenticate successfully. After successful authentication against the LDAP server(s), PHP sessions are used to track login into DOMjudge.
This method automatically authenticates each connection to the web interface as a fixed, configurable user. This can be useful for testing or demonstration purposes, but probably not for real use scenarios.
The authentication system is modular and adding new authentication methods is fairly easy. The authentication is handled in the file lib/www/auth.php. Adding a new method amounts to editing the functions in that file to handle your specific case.
Testdata is used to judge the problems: when a submission run is given the input testdata, the resulting output is compared to the reference output data using a compare script. The default compare script simply checks if the outputs are equal up to whitespace differences, but more elaborate comparisons can be done; see e.g. the float and boolfind_cmp scripts.
The database has a separate table named testcase, which can be manipulated from the web interface. Under a problem, click on the testcase link. There the files can be uploaded. The judgehosts cache a copy based on MD5 sum, so if you need to make changes later, re-upload the data in the web interface and it will automatically be picked up.
Testdata can also be imported into the system from a problem zip file, following the Kattis problem package format.
Once everything is configured, you can start the daemons. They all run as a normal user on the system; the needed root privileges are gained through sudo only when necessary. When running multiple judgedaemons on a multi-core machine, the -n X option should be used to bind each judgedaemon to CPU core X to prevent CPU resource conflicts.

If the daemons have started without any problems, you've come a long way! Now to check that you're ready for a contest.
First, go to the jury interface: http(s)://yourhost.example.edu/domjudge/jury. Look under all the menu items to see whether the displayed data looks sane. Use the config-checker under `Admin Functions' for some sanity checks on your configuration.
Go to a team workstation and see if you can access the team page and if you can submit solutions.
Next, it is time to submit some test solutions. If you have the default Hello World problem enabled, you can submit some of the example sources from under the doc/examples directory. They should give `CORRECT'.

You can also try some (or all) of the sources under tests. Use make check to submit a variety of tests; this should work when the submit client is available and the default example problems are in the active contest. There's also make stress-test, but be warned that these tests might crash a judgedaemon. The results can be checked in the web interface; each source file specifies the expected outcome with some explanations. For convenience, there is a link judging verifier in the admin web interface; this will automatically check whether submitted sources from the tests directory were judged as expected. Note that a few sources have multiple possible outcomes: these must be verified manually.
When all this worked, you're quite ready for a contest. Or at least, the practice session of a contest.
Before running a real contest, you and/or the jury will want to test the jury's reference solutions on the system.
The simplest way to do this is to include the jury solutions in a problem zip file and upload this. You can also upload a zip file containing just solutions to an existing problem. Note that the zip archive has to adhere to the Kattis problem package format. For this to work, the jury/admin who uploads the problem has to have an associated team to which the solutions will be assigned. The solutions will automatically be judged if the contest is active (but it need not have started yet). You can verify whether the submissions gave the expected answer from the link on the jury/admin index page.