
Timetable

  • Challenge Registration and Dry-run Period**: January 15 ~ May 1, 2022
  • Final Submissions: April 30 ~ May 1, 2022 (GMT+0 or UTC) (one-day only)
  • Winner & Runner-ups Code Submission (see guidelines below): May 5, 2022 (GMT+0 or UTC)
  • Winner Announcement: May 20, 2022
** Dry-run phase: A set of dry-run (validation) data is provided for debugging purposes only. This set is provided solely to validate your prediction format (so that your submission will work with our final testing). The Top-1 accuracy on this set is NOT indicative of final performance. Feedback is provided on this dry-run (validation) set only.

Prizes

  1. Action recognition from dark videos:
    1. Semi-supervised action recognition in the dark
      • Winner (1st): $1500
      • Runner-up (2nd): $1000
      • Runner-up (3rd): $700
For a total of $3,200 awarded in prizes.

Submission Process

To avoid inconvenience during the competition, and to ensure a smooth submission process, please follow these guidelines carefully.

  • Submission format (before winner validation): A zipfile containing a single "arid_pred.csv" file. A more detailed description can be found on the Codalab page under "Participate -- Submit/View Results"
  • Codalab pages: Codalab Link
  • After creating an account or logging into an existing account, please go to the User Settings by clicking on your icon on the top right corner. Then in the "Competition settings" section, fill in the "Team name" with your Team Name as registered in the Registration Form.
  • If you have forgotten your Team Name, please contact us through email. The Team Name MUST be shown in the "Results" tab to be accounted for prize winning.
  • The "Team Name" in the Submission page should also match the Team Name as registered in the Registration Form.
  • Test videos: downloadable April 30 (GMT+0 00:00) ~ May 1 (GMT+0 00:00).
  • The submission limit is enforced at the team level. That means, no matter how many members are in your team, each TEAM (as shown in the "Team Name" field) can submit only 5 times. Submissions of a team may come from different members, but only the (chronologically) first 5 submissions from that TEAM will be considered on the leaderboard.
  • Teams in the Top 5 on the final testing Codalab leaderboard will go through a winner validation and confirmation process; see below for details.
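For reference, the packing step for the prediction file can be sketched as follows. This is a minimal sketch: the file names come from the guidelines above, but the contents of "arid_pred.csv" are a placeholder (the required CSV format is described on the Codalab page).

```python
import zipfile

# Placeholder predictions file -- replace with your real "arid_pred.csv";
# its required column format is documented on the Codalab page.
with open("arid_pred.csv", "w") as f:
    f.write("")

# The submission must be a zip archive containing this single file.
with zipfile.ZipFile("submission.zip", "w") as zf:
    zf.write("arid_pred.csv")
```

Uploading "submission.zip" under "Participate -- Submit/View Results" then completes the submission.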

If you have any questions about this challenge track please feel free to email cvpr2022.ug2challenge@gmail.com

Winner Validation and Confirmation Process

Teams in the Top 5 on the final testing Codalab leaderboard will be required to submit their code for the winner validation and confirmation process. Please follow the instructions below carefully to ensure a smooth confirmation of your results:

  1. Download the training/validation/testing datasets. The testing dataset will be available for download on April 30 (GMT+0, 00:00).
  2. Train or finetune your model. To obtain the final testing result, run your trained model on the testing set.
  3. Pack your code into one single .zip file. By default we will use raw videos as the input to your code. If your algorithm requires individual video frames or other pre-processing, include your frame-extraction or pre-processing code in the zip file as well.
  4. Include detailed instructions on the placement of the training, validation, and testing data (i.e., where to place the data to enable training and testing).
  5. Your submission should contain at most three run scripts: one for data pre-processing (named run_preproc.sh), one for training (named run_train.sh), and one for testing (inference) (named run_test.sh). We will ONLY run these three scripts during the validation process.
  6. Including the trained weights that reproduce the result shown in the leaderboard is highly recommended.
  7. Each team must submit ONLY ONE zip file with ONLY ONE algorithm, corresponding to the submitted result on the testing leaderboard. The zip file should contain all dependencies and code required to run the algorithm.
  8. Additional data (i.e., all data not included in the training and testing sets provided by UG2 2022) must also be included. For easy download, and to ensure successful submission of the zip files, you may upload the additional data to a cloud server (e.g., Gdrive, Onedrive, Dropbox, or Baidu Netdisk) and include the shared link in the README within your submitted zip file. Also include detailed instructions on how to use the additional data. If you use public datasets, include their download sites directly in the README, with detailed instructions on how you downloaded and used them. This enables us to reproduce your training as closely as possible.
  9. NOTE: This validation process is MANDATORY for your results to be validated and counted towards the leaderboard and prizes. Its goal is to ensure that the restrictions on data usage are NOT VIOLATED. The results we reproduce do NOT serve as the final result: as long as the restrictions are not violated and the reproduced results fall within a reasonable range of the Codalab results, your Codalab results stand.
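The layout described above can be sketched as follows. This is a hypothetical example: only the three script names (run_preproc.sh, run_train.sh, run_test.sh) are mandated by the guidelines; the folder and archive names are placeholders.

```python
import os
import zipfile

# Hypothetical submission layout; only the three script names are required.
layout = [
    "code_submission/run_preproc.sh",  # required: data pre-processing
    "code_submission/run_train.sh",    # required: training
    "code_submission/run_test.sh",     # required: testing (inference)
    "code_submission/README",          # data placement + additional-data links
]
for path in layout:
    os.makedirs(os.path.dirname(path), exist_ok=True)
    open(path, "w").close()  # stand-ins for your real scripts and README

# One single zip per team, containing all code and dependencies.
with zipfile.ZipFile("team_code.zip", "w") as zf:
    for path in layout:
        zf.write(path)
```

Trained weights (recommended) and any source directories would be added to the archive in the same way.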

Running with NVIDIA Container Toolkit (nvidia-docker)

To ensure a smooth validation and confirmation process, we strongly recommend that participants run or verify their code with Docker and the NVIDIA Container Toolkit (nvidia-docker). Here we provide detailed instructions to install and run nvidia-docker.

  1. Install the NVIDIA GPU driver using a .run file. To ensure that you can run the required docker image, install a GPU driver with version >= 440.33.
  2. For the convenience of all participants, you may run the script provided here. Alternatively, follow the steps below.
  3. Install Docker and nvidia-docker following the instructions provided by NVIDIA here. You may also choose to install an older Docker version following the instructions provided by Docker here.
  4. Our default docker images are pulled from the NVIDIA NGC Catalog; their details are provided below. Follow the two steps below to pull and run the default docker image (also provided here or here):
    Using Pytorch:
    • docker pull nvcr.io/nvidia/pytorch:21.06-py3
    • docker run --gpus all -it --rm -v {local_dir}:{container_dir} nvcr.io/nvidia/pytorch:21.06-py3 (Docker >= 19.03)
    • nvidia-docker run -it --rm -v {local_dir}:{container_dir} nvcr.io/nvidia/pytorch:21.06-py3 (Docker <= 19.02)
    Using TensorFlow2:
    • docker pull nvcr.io/nvidia/tensorflow:21.06-tf2-py3
    • docker run --gpus all -it --rm -v {local_dir}:{container_dir} nvcr.io/nvidia/tensorflow:21.06-tf2-py3 (Docker >= 19.03)
    • nvidia-docker run -it --rm -v {local_dir}:{container_dir} nvcr.io/nvidia/tensorflow:21.06-tf2-py3 (Docker <= 19.02)
    Using TensorFlow1:
    • docker pull nvcr.io/nvidia/tensorflow:21.06-tf1-py3
    • docker run --gpus all -it --rm -v {local_dir}:{container_dir} nvcr.io/nvidia/tensorflow:21.06-tf1-py3 (Docker >= 19.03)
    • nvidia-docker run -it --rm -v {local_dir}:{container_dir} nvcr.io/nvidia/tensorflow:21.06-tf1-py3 (Docker <= 19.02)
  5. Verify your algorithm using the running docker container. For more information on running docker containers, you may refer to the documentation here.
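Putting the steps together, a verification run inside the default PyTorch container might look like the following. This is a sketch under assumptions: the mount paths and the invocation of the three run scripts from the winner-validation guidelines are illustrative, not prescribed commands.

```shell
# Pull the default image and run the three required scripts inside it.
# /path/to/submission is a placeholder for your unpacked code and data.
docker pull nvcr.io/nvidia/pytorch:21.06-py3
docker run --gpus all -it --rm \
    -v /path/to/submission:/workspace/ug2 \
    nvcr.io/nvidia/pytorch:21.06-py3 \
    bash -c "cd /workspace/ug2 && bash run_preproc.sh && bash run_train.sh && bash run_test.sh"
```

With Docker <= 19.02, replace `docker run --gpus all` with `nvidia-docker run`, as in the commands above.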

The use of Docker and nvidia-docker is NOT mandatory, but we will use the above docker images during the validation and confirmation process to ensure a fair comparison between algorithms.

Requirements

Software
  • Docker-CE
  • NVIDIA Docker
  • NVIDIA GPU Driver >= 465.19.01
Docker

Our provided Docker images include the following software:

For PyTorch:
  • Ubuntu 20.04
  • PyTorch 1.9.0a0+c3d40fd8
  • NVIDIA CUDA 11.3.1
  • NVIDIA cuDNN 8.2.1
  • TensorBoard 1.15.5
For Tensorflow:
  • Ubuntu 20.04
  • TensorFlow 1.15.5 or TensorFlow 2.5.0
  • NVIDIA CUDA 11.3.1
  • NVIDIA cuDNN 8.2.1
Hardware

The proposed algorithms should be able to run in systems with:

  • Up to and including RTX 3080 12 GB
  • Up to and including 12 cores
  • Up to and including 32 GB of memory


Rules

Read the following guidelines carefully before submitting. Methods not complying with the guidelines will be disqualified.

  • We encourage participants to use the provided training and validation data for each task, as well as their own data or data from other sources, for training. However, the use of any form of manual annotation, or the use of any of the provided benchmark test sets for either supervised or unsupervised training, is strictly forbidden. Detailed restrictions on data usage for UG2 2022 Track 2 have been sent through email, titled "Clarification on constraints of data usage for UG2 2022 Track 2". Please read the instructions carefully.
  • The team name of submissions on Codalab must match the registration information. Any submission under an unregistered team name will not qualify for prizes. The Team Name MUST be shown in the "Results" tab to qualify; for steps to ensure the Team Name is shown, follow the "Submission Process" above. Only a single submission per team can be the winner of this sub-challenge. Changes in algorithm parameters do not constitute a different method; all parameter tuning must be conducted using the provided dataset and any additional data the participants consider appropriate.
  • Results can vary based on the GPU used; we noticed that results varied slightly across GPU models. We are therefore restricting our evaluation to one model (RTX 3080) and encourage participants to use this model in their development.

Eligibility

  • Foreign Nationals and International Developers: All Developers can participate with this exception: residents of Iran, Cuba, North Korea, the Crimea Region of Ukraine, Sudan, or Syria, or other countries prohibited on the U.S. State Department’s State Sponsors of Terrorism list. In addition, Developers are not eligible to participate if they are on the Specially Designated Nationals list promulgated and amended, from time to time, by the United States Department of the Treasury. It is the responsibility of the Developer to ensure that they are allowed to export their technology solution to the United States for the Live Test. Additionally, it is the responsibility of participants to ensure that no US law export control restrictions would prevent them from participating when foreign nationals are involved. If there are US export control concerns, please contact the organizers and we will attempt to make reasonable accommodations if possible.

  • If you are entering as a representative of a company, educational institution or other legal entity, or on behalf of your employer, these rules are binding on you, individually, and/or the entity you represent or are an employee. If you are acting within the scope of your employment, as an employee, contractor, or agent of another party, you warrant that such party has full knowledge of your actions and has consented thereto, including your potential receipt of a prize. You further warrant that your actions do not violate your employer’s or entity’s policies and procedures.

  • The organizers reserve the right to verify eligibility and to adjudicate on any dispute at any time. If you provide any false information relating to the prize challenge concerning your identity, email address, ownership of right, or information required for entering the prize challenge, you may be immediately disqualified from the challenge.

  • Accounts. For the final testing phase, you may make submissions only under one, unique registration per team. Submissions should be made such that the "Team Name" field in Codalab and the "Team Name" shown in the "Results" tab match your team name in the registration. You will be disqualified if you make final-testing-phase submissions through more than one registration, if your team name cannot be found in the registration, or if your team name is NOT shown in the "Results" tab. Each TEAM may submit up to ten submissions per day per challenge. For the final winner validation/confirmation process, eligible teams should submit code containing only ONE algorithm per TEAM. Any submission that does not adhere to this during the testing or winner validation/confirmation process may be subject to disqualification.

The organizers reserve the right to disqualify any participating team for any of the reasons mentioned above and if deemed necessary.

Warranty, indemnity and release

You warrant that your Submission is your own original work and, as such, you are the sole and exclusive owner and rights holder of the Submission, and you have the right to make the Submission and grant all required licenses. You agree not to make any Submission that: (i) infringes any third party proprietary rights, intellectual property rights, industrial property rights, personal or moral rights or any other rights, including without limitation, copyright, trademark, patent, trade secret, privacy, publicity or confidentiality obligations; or (ii) otherwise violates any applicable state or federal law.

To the maximum extent permitted by law, you indemnify and agree to keep indemnified challenge Entities at all times from and against any liability, claims, demands, losses, damages, costs and expenses resulting from any act, default or omission of the entrant and/or a breach of any warranty set forth herein. To the maximum extent permitted by law, you agree to defend, indemnify and hold harmless the challenge Entities from and against any and all claims, actions, suits or proceedings, as well as any and all losses, liabilities, damages, costs and expenses (including reasonable attorneys fees) arising out of or accruing from: (a) your Submission or other material uploaded or otherwise provided by you that infringes any copyright, trademark, trade secret, trade dress, patent or other intellectual property right of any person or entity, or defames any person or violates their rights of publicity or privacy; (b) any misrepresentation made by you in connection with the challenge; (c) any non-compliance by you with these Rules; (d) claims brought by persons or entities other than the parties to these Rules arising from or related to your involvement with the challenge; and (e) your acceptance, possession, misuse or use of any Prize, or your participation in the challenge and any challenge-related activity.

You hereby release organizers from any liability associated with: (a) any malfunction or other problem with the challenge Website; (b) any error in the collection, processing, or retention of any Submission; or (c) any typographical or other error in the printing, offering or announcement of any Prize or winners.

Competition Framework

The Participant has requested permission to use the dataset as compiled by the Institute for Infocomm Research, A*STAR, Singapore, Nanyang Technological University, Singapore, NVIDIA AI Technology Centre and University of Texas at Austin. In exchange for such permission, Participant hereby agrees to the following terms and conditions:

  • The Institute for Infocomm Research, A*STAR, Singapore, Nanyang Technological University, Singapore, NVIDIA AI Technology Centre and University of Texas at Austin make no representations or warranties regarding the Dataset, including but not limited to warranties of non-infringement or fitness for a particular purpose.

  • Pre-trained models are allowed in the competition.

  • Participants are not restricted to training their algorithms on the provided training set. Collecting and training on additional data is encouraged.

Creative Commons License
The released dataset is licensed under a Creative Commons Attribution 4.0 International License.

Contact

If you have any questions about this challenge track please feel free to email cvpr2022.ug2challenge@gmail.com
