Dear DOMjudge developers,
first of all, thank you for writing DOMjudge. We are using it at TU München, Germany, for many courses and competitions. With more than 100 contests and 50,000 submissions by 1,000 teams, we have changed many parts of the system and created a fork of your project. We would like to contribute our changes back to DOMjudge if you are interested in them.
You can see our website, for instance, at https://judge.in.tum.de/conpra (other sections are reachable from the top left corner of every page). The sources we use are available at https://git-judge.in.tum.de/icpc/tumjudge. We will create merge requests in case you are interested in our changes. We would do some things a little differently if we were not working in our own fork; for instance, we created new CSS and JavaScript files to avoid conflicts when merging your changes.
These are the main features we added to the system:
* a modern layout based on Bootstrap with a nicer menu bar and form controls
* teams can also view, download and edit their submissions, as jury members already can
* an overview of points throughout all contests a team participated in, grouped by similar contest names
* bonus points can be awarded to teams for individual problems and contests
* the statistics page generates some LaTeX commands to include detailed statistics in solution slides
* problems have additional fields author, difficulty, topic and source, and there are filters for these fields
* write access to user and team accounts can be restricted for admins (We run several DOMjudge instances for several courses since we don't want lecturers of different courses to adjust the system to their needs. User accounts are shared via MySQL replication, which makes user data read-only for all but one instance; all other data, including roles, is not shared.)
* links in the menu are customizable in the settings
* scoreboards can be shuffled in case you don't want to reveal which problems were solved (see contest 6 at the link given above for an example)
* the judge instances are integrated into our website, the news system and the account system of the university (probably nothing that somebody else would need)
We also created a framework to generate random test data for problems from seeds. This allows us to run failing team submissions on small random inputs until we find one that makes the submission fail. That way, we usually find much smaller inputs that are easier to debug. These scripts can be called from the jury backend, and their output is shown there as well.
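To give a rough, purely illustrative picture of the approach, here is a minimal sketch in Python. The input format, the seeded generator and the ./submission and ./reference binaries are made-up placeholders for this example, not our actual scripts:

    import random
    import subprocess

    def generate_case(seed, max_n=20):
        # Build a small random test case from a seed.
        # Hypothetical input format: first line n, second line n integers.
        rng = random.Random(seed)
        n = rng.randint(1, max_n)
        values = [str(rng.randint(1, 100)) for _ in range(n)]
        return "{}\n{}\n".format(n, " ".join(values))

    def run(cmd, case):
        # Feed the test case to a solver on stdin and capture its stdout.
        return subprocess.run(cmd, input=case, capture_output=True,
                              text=True, timeout=10).stdout

    def find_failing_case(submission_cmd, reference_cmd, tries=1000):
        # Try many seeds on small inputs until the submission's output
        # differs from the reference solution's output.
        for seed in range(tries):
            case = generate_case(seed)
            if run(submission_cmd, case) != run(reference_cmd, case):
                return seed, case
        return None

    if __name__ == "__main__":
        result = find_failing_case(["./submission"], ["./reference"])
        if result:
            print("submission fails on seed {}:\n{}".format(*result))

In practice, a per-problem generator and the problem's actual output validator would take the place of the plain string comparison above.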
We hope some of these changes are interesting for you. If so, please tell us; we can tell you more about what we did and would like to contribute the corresponding changes.
Best regards,
Stefan
Hi Stefan,
On 03-03-16 08:05, Stefan Toman wrote:
> Dear DOMjudge developers,
> first of all, thank you for writing DOMjudge. We are using it at TU München, Germany, for many courses and competitions. With more than 100 contests and 50,000 submissions by 1,000 teams, we have changed many parts of the system and created a fork of your project. We would like to contribute our changes back to DOMjudge if you are interested in them.
Thanks, we're always interested to hear what new features people have added and to see whether it makes sense to merge them back into the main source.
> You can see our website, for instance, at https://judge.in.tum.de/conpra (other sections are reachable from the top left corner of every page). The sources we use are available at https://git-judge.in.tum.de/icpc/tumjudge. We will create merge requests in case you are interested in our changes. We would do some things a little differently if we were not working in our own fork; for instance, we created new CSS and JavaScript files to avoid conflicts when merging your changes.
> These are the main features we added to the system:
> - a modern layout based on Bootstrap with a nicer menu bar and form controls
We are currently also experimenting with using a PHP framework; see the framework-rewrite branch. Your site looks very good, but I guess we will have to see how/if this could be combined.
> - teams can also view, download and edit their submissions, as jury members already can
I think this is a nice feature that we even talked about already, and could be merged into the master branch.
> - an overview of points throughout all contests a team participated in, grouped by similar contest names
> - bonus points can be awarded to teams for individual problems and contests
These two features are more geared towards long-running "online judge" or "courseware" instances. We've been struggling with how to deal with such features: they may be useful in those settings, but could be considered feature bloat/creep for the basic system. We have a branch 'online-judge' for this, but it has never really taken off...
> - the statistics page generates some LaTeX commands to include detailed statistics in solution slides
> - problems have additional fields author, difficulty, topic and source, and there are filters for these fields
> - write access to user and team accounts can be restricted for admins (We run several DOMjudge instances for several courses since we don't want lecturers of different courses to adjust the system to their needs. User accounts are shared via MySQL replication, which makes user data read-only for all but one instance; all other data, including roles, is not shared.)
> - links in the menu are customizable in the settings
> - scoreboards can be shuffled in case you don't want to reveal which problems were solved (see contest 6 at the link given above for an example)
This may be a nice small feature if it can be implemented relatively cleanly.
> - the judge instances are integrated into our website, the news system and the account system of the university (probably nothing that somebody else would need)
>
> We also created a framework to generate random test data for problems from seeds. This allows us to run failing team submissions on small random inputs until we find one that makes the submission fail. That way, we usually find much smaller inputs that are easier to debug. These scripts can be called from the jury backend, and their output is shown there as well.
Are you aware of the (now separate) checktestdata program, and that it can also generate rudimentary test data?
These scripts could possibly be added as contributed scripts under 'misc-tools' in the main branch.
> We hope some of these changes are interesting for you. If so, please tell us; we can tell you more about what we did and would like to contribute the corresponding changes.
Yes, I think there are certainly some interesting changes. Others might not be as suitable for inclusion, but they could still be valuable to people who'd like to modify DOMjudge for similar contexts. It's great that you provide a public repository, and your adaptations seem relatively clean at first glance; we could add a link to it.
Jaap