Dear AlphaPilot Contestants,

Great work on Tests 1 and 2! We are very excited about all the excellent submissions we received.

We want to ensure that all teams have the best chance to format their Test 3 submissions correctly given the new guidelines below. As such, we have decided to make a final extension of the Test 3 deadline to Friday, March 22nd at 5:00PM EDT (New York). This update contains all the information teams need to format their Test 3 submissions correctly, so please read it carefully.

 

Test 3 General Updates

In response to requests on the forum, we are providing a few updates to Test 3. Because the simulation itself was updated, we suggest all teams pull the latest version of FlightGoggles from GitHub.

Some gate perturbations in Test 3 placed gates very close to walls, making them hard to traverse. Our intention is not to make teams do obstacle detection and avoidance in this test. As a result, the Challenge Files were regenerated, and the new Leaderboard Challenge Files can be downloaded here:

A few small updates were made to Test 3 including:

  • In FlightGoggles, the race time now starts when the drone is armed, so teams will get a chance to initialize algorithms
  • Minor bug fix on scorer.sh, which was indexing incorrectly
  • Minor bug fix in scorer.py, which could not resolve its path when run from a non-standard location

 

Test 3 Submission Requirements

For Test 3 submissions, each Team Captain should upload to the HeroX Test 3 submission form:

  • 1 PDF file describing their algorithm in a 2-page Technical Report
  • 1 zipped archive named ‘submission’ with a maximum upload size of 1GB

We have set the maximum allowable upload size, but if you think your submission will exceed this limit, please contact the AlphaPilot team as soon as possible, prior to submission. Teams will have internet access during installation, but there will be no internet access during testing. Teams will not have root access at any point.

In your archive named ‘submission’, at the top level of the directory, please include the following:

  1. scorer.launch
    1. Team archives must contain and edit `scorer.launch` to include all ROS nodes required for completing the challenges. This will be the launch file used to run your team’s algorithm.
  2. install.bash
    1. If desired, teams may include an install.bash file which can be run to automatically install algorithm dependencies. Your install.bash file can run commands such as catkin build, rosdep, etc. as needed. Alternatively, teams can include compiled libraries in their archive as noted below. Please note the limitation on installation time of 1 hour.
  3. catkin_ws
    1. Teams must include a catkin workspace with the team’s own ROS packages separate from the flightgoggles catkin workspace. It should be named ‘catkin_ws’ and include the following folders:
      1. src – the ‘src’ folder should include all source code for the team’s ROS packages
      2. devel – the ROS packages should already be compiled into the ‘devel’ folder, the development space, and should include setup.bash (which is usually autogenerated when you build with catkin)
  4. Compiled Libraries and External Code
    1. Teams can include additional code anywhere in the archive as needed for your algorithms to run. This can include compiled libraries, source code, etc. Note that the testing environment will already have ROS Kinetic, OpenCV 3.4, and FlightGoggles installed, so there is no need to include those.
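As a sketch, the required top-level layout of the ‘submission’ archive can be assembled like this. The directory and file names come from the requirements above; the package name `my_planner` is a hypothetical placeholder for your own ROS packages:

```shell
# Build the required top-level layout of the 'submission' archive.
# 'my_planner' is a placeholder for the team's own ROS package(s).
mkdir -p submission/catkin_ws/src/my_planner
mkdir -p submission/catkin_ws/devel
touch submission/scorer.launch               # launch file starting all team ROS nodes
touch submission/install.bash                # optional dependency installer (1-hour limit)
touch submission/catkin_ws/devel/setup.bash  # normally autogenerated by catkin build
ls submission
```

In a real submission, `devel/` would be populated by building the workspace with catkin rather than by `touch`.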

 

Test 3 Algorithm Submission Testing

The testing environment is an instance of Ubuntu 16.04.5 LTS running on an AWS p3.2xlarge instance. The following image will be used as the basis for the testing environment: https://aws.amazon.com/marketplace/pp/B077GCZ4GR

Additionally, this environment will have ROS Kinetic, OpenCV 3.4, and FlightGoggles already installed for teams.

The final algorithm testing process will be conducted as follows:

1.     The FlightGoggles catkin workspace, with the addition of “scorer.py” in flightgoggles/launch, will be sourced.

2.     A new Linux user home directory will be created. The team’s archive will be unzipped and the ‘submission’ folder unpacked into the user’s home directory.

3.     A symbolic link “scorer.launch” will be added to flightgoggles/launch that links to “scorer.launch” in the submission root directory.
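For illustration, step 3 amounts to something like the following. The flightgoggles workspace path shown here is an assumption for the sketch; only the link name, the link target, and the flightgoggles/launch location are given above:

```shell
# Stand-in for the real flightgoggles install location (path assumed for illustration)
mkdir -p ~/flightgoggles_ws/src/flightgoggles/launch
# Stand-in for the unpacked submission containing the team's scorer.launch
mkdir -p ~/submission && touch ~/submission/scorer.launch
# Link the team's scorer.launch into flightgoggles/launch
ln -sf ~/submission/scorer.launch ~/flightgoggles_ws/src/flightgoggles/launch/scorer.launch
readlink ~/flightgoggles_ws/src/flightgoggles/launch/scorer.launch
```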

4.     Team dependencies will be added by running the following command in the submission root directory:

>> ./install.bash && source catkin_ws/devel/setup.bash

Note: there is a 1-hour cut-off on installation time. If a team’s dependency installation takes longer than this, the script will be stopped, and the team will be given an Algorithm Score of 0 points.

5.     The scorer will run the launch file 25 times with the perturbed gates in `flightgoggles/config/gate_locations_x.yaml` and accumulate the results from the reporter in a `results` folder. At the end of the evaluation, `scorer.py` is run which generates a `scores.yaml` which contains the individual scores for each run. This will be implemented in testing using the following command:

>> rosrun flightgoggles scorer.sh

6.     The scores.yaml file will be read, the top 5 scores will be found and averaged, and this final result will be used as the algorithm score.

Note: there is a 2-hour cut-off on total submission evaluation time (Final Testing steps 5 and 6). If a team’s algorithm takes longer than this to evaluate, the testing will be stopped, and the team will be given an Algorithm Score of 0 points.
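As an illustration of step 6, averaging the top 5 of the 25 run scores could look like this. The exact layout of scores.yaml is an assumption here (one `run_N: score` entry per line), and the sample uses 7 runs instead of 25 to keep it short:

```shell
# Sample scores.yaml (format assumed: one 'run_N: score' entry per run)
cat > scores.yaml <<'EOF'
run_1: 10.0
run_2: 60.0
run_3: 20.0
run_4: 50.0
run_5: 30.0
run_6: 40.0
run_7: 5.0
EOF
# Take the top 5 scores and average them to produce the final algorithm score
avg=$(awk -F': ' '{print $2}' scores.yaml | sort -rn | head -n 5 \
      | awk '{s += $1} END {printf "%.2f", s / NR}')
echo "$avg"   # top 5 are 60, 50, 40, 30, 20 -> average 40.00
```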

 

Separately, judges will read and score each team’s technical report, which will subsequently be used to calculate the team’s total final score for Test 3.

 

Test 3 Submission Checking

It is essential that teams follow the above submission requirements exactly. However, in the event your team makes a small mistake, we have implemented a Submission Checker for Test 3. If a submission fails the algorithm submission testing described above, the team will have up to 5 opportunities to fix it before Monday, March 25th at 5PM EDT (New York). In the event of a submission failure, your team will receive an email with the error and will have a short window to re-submit. If we suspect teams are abusing this system, it will be investigated and scored accordingly.

 

Test 3 Source Code and IP Concerns

As a reminder, the judges will have access to teams’ submitted source code to help them understand the technical approach taken for Test 3. If there are any IP concerns, the AlphaPilot team suggests compiling that IP into an executable that is called from your source code, and including that executable in your archive. However, abstracting functionality away into executables makes it more difficult for the judges to verify that teams have the technical skills and capabilities to succeed during the AlphaPilot Competition. As a result, please balance your team’s need to protect IP against transparency about your approach.

 

Happy coding,
The AlphaPilot Team