Dear AlphaPilot Contestants,

 

Here, in one place, is all the information you need to format your Test 1 and Test 2 submissions, due this Friday, March 8th, at 5:00 pm ET.

 

Test 1 Submission Requirements

For Test 1 submissions, each Team Captain should complete the HeroX Test 1 submission form, which consists of two main components:

  1. Written responses to the five questions
  2. A link to the team’s video on YouTube or Vimeo

Both parts must be submitted together via the Test 1 entry form. Teams may submit one and only one entry form, which must come from the Team Captain. Each written response is limited to 4,000 characters, including spaces. The video must be less than 3 minutes long, and any video uploaded or updated after the final deadline will not be considered.

 

Test 2 Submission Requirements

For Test 2 submissions, each Team Captain should upload the following to the HeroX Test 2 submission form:

  1. One PDF file: a two-page Technical Report describing the team’s algorithm
  2. One zipped archive named ‘submission’ (maximum upload size: 250 MB)

At the top level of the folder named ‘submission’, please include the following:

  1. generate_results.py
    1. Within this script, teams need to define their algorithm in the predict(self,img) function of the GenerateFinalDetections() class. This function is called by the generate_submission.py test script, which reads all test images, imports and calls each team’s algorithm, and outputs labels for all images in a JSON file.
  2. requirements.txt
    1. A file listing the libraries that must be installed for the algorithm’s source code to run (a short example follows this list).
  3. Other code
    1. Teams can include additional code in the directory as needed for their algorithms to run. This can include compiled libraries, source code, etc.
    2. Teams SHOULD NOT include any of the following in their submitted archive:
      1. Any code not needed for their algorithms to run
      2. generate_submission.py, as AlphaPilot will run its own version of this script
      3. Scoring scripts, as AlphaPilot will run its own versions of these
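
For reference, requirements.txt is a plain pip requirements list: one package per line, optionally pinned to a version. The packages below are purely illustrative; list whatever your algorithm actually needs:

    numpy==1.16.1
    opencv-python
    scikit-image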

A sample submission with an example class and predict function is included in the starter scripts; these should help teams format and define a solution.
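
For example, a bare-bones generate_results.py might look like the sketch below. This is not the official starter script, and the returned label values are placeholders; use the exact label format defined in the provided starter scripts.

    class GenerateFinalDetections():
        def __init__(self):
            # Load any models or weights once, at construction time,
            # so each call to predict() stays fast.
            pass

        def predict(self, img):
            # img: a test image supplied by generate_submission.py.
            # Return this image's labels in the format expected by the
            # scoring script (placeholder values shown here).
            return [[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5]]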

 

Test 2 Algorithm Submission Checking

The final algorithm testing is completed automatically; it is therefore essential that teams follow the above requirements exactly. To reduce the chance of small technical errors causing issues for teams, the AlphaPilot team has provided all the tools needed to check and validate Test 2 algorithm submissions.

The final algorithm testing process will be conducted as follows:

  1. A new Linux user home directory is created. The team’s archive is unzipped and unpacked into the user’s home directory, and a virtual testing environment is set up there. The testing environment is as previously described, except that we have decided to also make OpenCV 4.0 available to teams.
  2. Team dependencies will be installed according to the libraries listed in the requirements.txt file using the following command: >> pip install -r requirements.txt
  3. An internal version of the testing script, generate_submission.py, will run through all available image data, call each team’s predict(self,img) function to get the labels and execution time for each image, and store the results in a JSON file named ‘random_submission.json’. The internal version of generate_submission.py is functionally comparable to the version given to teams and differs only in that it performs additional checks on algorithms. (A sketch of this loop follows the list.)
  4. Once all results are stored in the JSON file, the scoring script (score_detections.py) is run to compare the output JSON file against a master JSON containing the ground-truth labels. The total algorithm score is then calculated from the scoring script’s output together with the execution times recorded by generate_submission.py.
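
To make step 3 concrete, the sketch below shows roughly what such a testing loop does. It is not the internal script; the image directory, file extension, and JSON layout are assumptions for illustration only.

    import glob
    import json
    import time

    import cv2
    from generate_results import GenerateFinalDetections

    detector = GenerateFinalDetections()
    results = {}
    total_time = 0.0
    for path in sorted(glob.glob('test_images/*.JPG')):
        img = cv2.imread(path)
        start = time.monotonic()
        results[path.split('/')[-1]] = detector.predict(img)
        total_time += time.monotonic() - start  # execution time also feeds the score

    with open('random_submission.json', 'w') as f:
        json.dump(results, f)
    print('Total prediction time: %.2f s' % total_time)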

To check that your submitted source code will function correctly, we suggest that teams follow the above process themselves using the previously provided starter and scoring scripts.
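
As an additional sanity check after running the starter script locally, you can verify that the output JSON contains an entry for every test image. The file and directory names here are assumptions matching the sketch above.

    import glob
    import json

    with open('random_submission.json') as f:
        results = json.load(f)

    expected = {p.split('/')[-1] for p in glob.glob('test_images/*.JPG')}
    missing = expected - set(results)
    print('Missing labels for:', sorted(missing) if missing else 'none')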

 

Test 2 Source Code and IP Concerns

As a reminder, the judges will have access to teams’ submitted source code to help them understand the technical approach taken for Test 2. If a team has IP concerns, the AlphaPilot team suggests compiling that IP into an executable that is called from the team’s source code and including that executable in the archive. However, abstracting functionality away in executables makes it more difficult for the judges to verify that teams have the technical skills and capabilities to succeed during the AlphaPilot Competition. Please balance your team’s need to protect IP against the transparency of your approach.
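
If you take that route, one possible pattern is to invoke the compiled binary from your Python code and parse its output. The executable name and its input/output contract below are hypothetical.

    import json
    import subprocess

    def run_detector(image_path):
        # Hypothetical: './detector_bin' is your compiled, IP-protected
        # binary, assumed to print JSON-formatted labels to stdout.
        proc = subprocess.run(['./detector_bin', image_path],
                              capture_output=True, text=True, check=True)
        return json.loads(proc.stdout)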

 

As always, keep posting to the forum as needed!

 

Happy coding,
The AlphaPilot Team