AlphaPilot – Lockheed Martin AI Drone Racing Innovation Challenge

AlphaPilot is the first large-scale open innovation challenge of its kind focused on advancing artificial intelligence (AI) and autonomy.
Overview

Upcoming Event: Join us for a live Q&A webinar with Lockheed Martin, MIT, and the Drone Racing League this Thursday, February 22 at 10 AM PST! Register now.

 

Calling all coders, gamers, race fans, and drone enthusiasts...

 

Lockheed Martin and The Drone Racing League (DRL) challenge you to participate in AlphaPilot, an open innovation challenge to develop artificial intelligence (AI) for high-speed racing drones.

 

Enter the AlphaPilot Innovation Challenge today for your chance to master autonomous flight and win more than $2,000,000 in cash prizes.

 

AlphaPilot will challenge teams of up to 10 participants each to design an AI framework, powered by the NVIDIA Jetson platform for AI at the edge, that is capable of flying a drone without any human intervention or navigational pre-programming. Autonomous drones will race head-to-head through complex, three-dimensional tracks in DRL’s new Artificial Intelligence Robotic Racing (AIRR) Circuit, starting in 2019.

AlphaPilot aims to unite a diverse community of practicing and emerging AI experts, researchers and students to inspire the next generation of autonomous drone technology. By participating in this challenge, your knowledge and ideas can contribute directly toward the future of autonomous transportation, delivery, disaster relief, and even space exploration!


Why Drones?

Drone racing is a futuristic sport and a big draw for millennials and K-12 students with an interest in technology — many of whom will become future STEM professionals, drone pilots, and engineers. Lockheed Martin recognizes the important role the sport can play in helping to develop a workforce with the skills to compete in a 21st-century high-tech economy. Lockheed Martin and DRL are targeting U.S. undergraduate and graduate students to apply for AlphaPilot; however, the competition is open to drone enthusiasts, coders, and technologists of all ages from around the world.

 

Why is Lockheed Martin doing this?

For more than 100 years, Lockheed Martin has been redefining flight — from the fastest speeds, to the edge of space, to unmatched maneuverability and stealth. AI-enabled autonomy promises to fundamentally change the future of flight, and we are actively developing disruptive new AI technologies that will help our customers accomplish their most important missions – from reaching Mars to fighting wildfires.

 

WHAT CAN I DO RIGHT NOW?

  • Click ACCEPT CHALLENGE above to apply for AlphaPilot.
  • Read the Challenge Guidelines to learn about the competition.
  • Share this challenge on social media using the icons above with friends, family, or anyone you know who has a passion for discovery.
  • Start a conversation in our Forum to join the discussion, ask questions, or connect with other innovators.

ABOUT LOCKHEED MARTIN

Headquartered in Bethesda, Maryland, Lockheed Martin is a global security and aerospace company that employs approximately 100,000 people worldwide and is principally engaged in the research, design, development, manufacture, integration and sustainment of advanced technology systems, products and services. This year, the company received three Edison Awards for groundbreaking innovations in autonomy, satellite technology and directed energy. For more information, please visit www.lockheedmartin.com/alphapilot.


ABOUT DRONE RACING LEAGUE

DRL is the professional drone racing circuit for elite FPV pilots around the world. A technology, sports and media company, DRL combines world-class media and proprietary technology to create thrilling 3D drone racing content with mass appeal. In 2018 DRL hosted a global series of seven races, the Allianz World Championship, which aired on ESPN, Sky Sports, ProSiebenSat.1 Media SE, Groupe AB, Disney XD, OSN, and FOX Sports Asia. For more information, please visit www.drl.io.


AlphaPilot Qualifier

Test #3 – Guidance, Navigation & Control

 

Overview:

The third component of AlphaPilot qualifications will focus on a team’s ability to design algorithms for the guidance, navigation, and control (GNC) of an autonomous drone. The test utilizes a simulator framework that provides the tools needed to test drone racing algorithms against realistic dynamics and exteroceptive sensors. These skills are essential for competition in AlphaPilot, and the test is considered a precursor to the work Finalist teams will conduct in preparation for each AIRR race event.

 

Updates

What’s New? What’s Different?

The details of the Final Challenge Course and the Leaderboard Challenge Course are now out!

Teams can download the Leaderboard Challenge Files now (Challenge_Leaderboardtest.zip) and submit their Scores YAML file to the leaderboard. The first round on the leaderboard will close on Monday, Feb. 18th at 11:59 PM PST, and submissions will appear on the leaderboard by 9 AM PST the following day. After this, the leaderboard will refresh daily until the final deadline.

For more details on the evaluation, resources, algorithm and submission requirements, and testing, please read the following sections. Summaries of the most frequently asked questions from the Test 3 Forum, along with responses, will be added continually to the new ‘FAQ’ section at the end of this tab.

What’s yet to come?

Updates will be posted as needed. Otherwise, teams should submit their algorithm source code archive and technical report by March 8th! Good luck!!

 

Evaluation:

Goal

Teams must develop GNC algorithms to fly a simulated drone through a structured test environment utilizing a typical ACRO/RATE flight mode for control inputs and exteroceptive sensors for feedback. Teams are tasked with:

  1. Developing GNC algorithms to pilot an autonomous drone through FlightGoggles
  2. Describing their GNC algorithms in a 2-page Technical Report

The GNC algorithms must be capable of navigating through gates (see Figure 1) in a FlightGoggles challenge course. The objective is to pass through all gates in the defined order, as quickly as possible, without crashing.

 

Figure 1: An example drone race course in MIT’s FlightGoggles simulation.

Test 3 Scoring

Each team will receive a Test #3 score (max 100 points) that combines an objective score from their racing performance with a qualitative review of their written submission:

  • Algorithm Score: 70% of total Test 3 score
  • Technical Report Score: 30% of total Test 3 score

 

Algorithm Score

Each team’s drone performance will receive an Algorithm Score (max 70 points) that is based on a metric evaluating the ability of their submitted algorithms to navigate through race gates. Total lap time will be the primary measure of performance, along with points awarded for each successful gate fly-through. The Reporter node in FlightGoggles tracks these metrics and outputs this score per “run”:

Run Score = (10 pts × gates successfully passed) − race completion time (in seconds)

Algorithms are tested on 25 slight variations (“runs”) of the same challenge course; the Racing Score is calculated by averaging the best 5 of the 25 run scores. Since it is possible to hit obstacles and crash, runs with a crash or object strike receive 0 pts.

This averaged Racing Score is reported on the Leaderboard and scaled slightly for the Final Algorithm Score, so the best team receives the maximum allowable 70 pts.
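
To make the arithmetic concrete, here is a minimal sketch of the run scoring and best-5-of-25 averaging in Python. The run data is hypothetical, and the final scaling to 70 pts is omitted because its exact formula is not specified:

```python
# Minimal sketch of Test 3 run scoring and best-5-of-25 averaging.
# Run data is hypothetical; the final scaling to 70 pts is not shown
# because its exact formula is not given in the guidelines.

def run_score(gates_passed: int, completion_time_s: float, crashed: bool) -> float:
    """+10 pts per gate passed, minus completion time in seconds; 0 on a crash."""
    if crashed:
        return 0.0
    return 10.0 * gates_passed - completion_time_s

# One (gates_passed, completion_time_s, crashed) entry per run; 25 runs in practice.
runs = [
    (11, 42.3, False),
    (11, 45.1, False),
    (9, 50.7, False),
    (0, 0.0, True),   # crash or object strike scores 0
]

scores = sorted((run_score(*r) for r in runs), reverse=True)
best = scores[:5]                      # best 5 of the 25 runs
racing_score = sum(best) / len(best)   # averaged Racing Score
print(f"Racing Score: {racing_score:.1f}")
```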

 

Technical Report Score

Each team’s Technical Report Score (max 30 pts.) is based on a rubric evaluated by judges from academia, industry, and government. The judges will review each report for technical merit, applicability to AlphaPilot, and presentation and clarity of the approach.

 

Resources:

Test #3 is built on the Massachusetts Institute of Technology (MIT) FlightGoggles simulator, a first-of-its-kind virtual reality environment for drone research and development. MIT has modified FlightGoggles to include software for a drone racing simulation environment based on the Unity3D game engine. It includes Robot Operating System (ROS) bindings for integrating autonomy software and Unity assets for environments that resemble drone racing scenarios.

The FlightGoggles simulator and several training courses are now available open-source to the public and are hosted on GitHub. You can find the source code for the simulator, more information about how to get started, and some example challenge files here: http://flightgoggles.mit.edu/

 

Vehicle Model:

The simulator emulates a high-performance racing drone that accepts angular rate and collective thrust command inputs. Note that this drone is not an exact model of the one used by AlphaPilot in AIRR race events. A Vehicle Dynamics node keeps track of the simulation clock and adjusts the simulation clock ratio: the ratio can be lowered seamlessly at runtime if the host computer cannot sustain real-time performance under high load, or raised if faster-than-real-time simulation is desired. More details about the drone (vehicle dynamics, sensor models, specifications, etc.) will be added to the GitHub repo.
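
For teams new to this control mode, here is a minimal sketch of a ROS node publishing rate/thrust commands. The topic name and mav_msgs/RateThrust message type follow the convention of the public FlightGoggles repository, but treat both as assumptions and verify against the repo:

```python
# Minimal sketch: publish angular-rate/collective-thrust commands from ROS.
# Topic and message type (/uav/input/rateThrust, mav_msgs/RateThrust) are
# assumptions based on the public FlightGoggles repository; verify locally.
import rospy
from mav_msgs.msg import RateThrust

rospy.init_node("gnc_sketch")
pub = rospy.Publisher("/uav/input/rateThrust", RateThrust, queue_size=1)
loop = rospy.Rate(60)  # match the 60Hz camera rate

while not rospy.is_shutdown():
    cmd = RateThrust()
    cmd.header.stamp = rospy.Time.now()
    cmd.angular_rates.z = 0.1  # body yaw rate, rad/s (hypothetical command)
    cmd.thrust.z = 9.81        # collective thrust to roughly hover (mass-normalized)
    pub.publish(cmd)
    loop.sleep()
```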

 

Training Courses & Course Developer:

The simulator environment includes several pre-developed challenge courses and an open-map course creator for easy development and testing. Teams can use both resources for practice and may submit performance data on a Leaderboard Challenge Course to be included on a HeroX Leaderboard. The Final Challenge Course files will be developed by the AlphaPilot administration and kept on a separate instance. This course will be used for scoring submitted algorithms and determining teams’ final Test 3 Algorithm Scores.

The simulator is run with a Challenge File in YAML format that corresponds to a challenge course, and it produces a Results YAML File when the run finishes. The Challenge File describes the following details (a hypothetical example follows the list):

  • Initial position and orientation of the drone
  • Challenge Name
  • Timeout Specification
  • A list of gate names
  • Width of all gates
  • Location of all the gates, determined by four points that make a rectangle
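
For illustration, a hypothetical Challenge File covering the fields above might be parsed as follows. All field names and values here are invented for the example; consult the actual challenge files and the FlightGoggles documentation for the real schema:

```python
# Hypothetical Challenge File with the fields listed above. Field names and
# values are invented for this example; see the actual challenge files for
# the real schema.
import yaml  # PyYAML

challenge_text = """
challenge_name: example_course
timeout: 120.0                 # seconds before the run is declared Timeout
initial_pose: [0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]   # position + orientation quaternion
gate_width: 2.5
gate_names: [Gate1, Gate2, Gate3]
Gate1:
  location:                    # four corner points defining the gate rectangle
    - [10.0, -1.25, 1.0]
    - [10.0,  1.25, 1.0]
    - [10.0,  1.25, 3.5]
    - [10.0, -1.25, 3.5]
"""

challenge = yaml.safe_load(challenge_text)
print(challenge["gate_names"], "timeout:", challenge["timeout"])
```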

 

Simulator Output:

The simulator is equipped with a Reporter node that tracks ground-truth information and outputs metrics on algorithm performance, via the command line, as a Results YAML File for each run on a Challenge YAML File. The file contains metrics on drone performance attributes useful for logging, evaluating, and visualizing algorithm performance.

Additionally, the Simulator Reporter will output a YAML file with details about the run, including one of the following results:

  • Interrupted: the grader process was interrupted, e.g., via ctrl+c.
  • Timeout: the timeout in the challenge file was reached.
  • Crashed: the drone crashed into an obstacle in the sim.
  • Completed: the drone passed through the last gate in the sequence.

The Reporter also records the time at which the drone reaches each gate. If a gate in the sequence is skipped, the grader will record “Success: False” for that gate.
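
To make the Reporter output concrete, a hypothetical Results File might look like the following; field names are illustrative only, not the exact schema:

```python
# Hypothetical Results File showing the run outcome, per-gate success flags,
# and gate arrival times tracked by the Reporter. Field names are
# illustrative only.
import yaml  # PyYAML

results_text = """
result: Completed             # Interrupted | Timeout | Crashed | Completed
total_time: 48.7
gates:
  Gate1: {Success: true, time: 7.2}
  Gate2: {Success: false}     # skipped in sequence
  Gate3: {Success: true, time: 31.9}
"""

results = yaml.safe_load(results_text)
passed = sum(1 for gate in results["gates"].values() if gate["Success"])
print(f"{results['result']}: {passed} gates passed in {results['total_time']}s")
```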

With the Leaderboard Challenge Course files, teams receive scripts to help score the Results YAML files and output a Scores YAML file. See the included ‘Scorer Scripts’ in the challenge files, as well as the Readme.md file, for more details on running these scripts. Teams may submit the Scores YAML file of their summarized results for scoring and placement on the Leaderboard.

The Final Challenge Course files will not be released to teams and will be used to evaluate each team’s algorithm and determine their final FlightGoggles score. See the ‘Testing’ section below for more details on Leaderboard and Final Testing.

 

Tech Support:

Teams may file an issue on the simulator GitHub page if they encounter any bugs or errors. Issues will be addressed by the MIT team as quickly as possible. Any specific questions about the AlphaPilot Test 3 challenge should be posted and addressed on the HeroX Test 3 Forum page.

 

Scorer Scripts

Challenge_Leaderboardtest.zip\Challenge_Leaderboardtest\launch – scripts to score the Results YAML files and output a Scores YAML file. See the Readme.md file for more details on running these scripts.

 

Submission Requirements:

Teams will submit two attachments via the Test #3 entry form:

(1) Autonomous drone racer binaries and source code in a zipped archive

(2) Technical Report of the drone racer in PDF, maximum 2 pages.

 

Algorithm Requirements:

Teams must develop GNC algorithms to fly the simulated drone through a structured test environment utilizing a typical ACRO/RATE flight mode for control inputs and only on-board sensory feedback. As is typical in drone racing, teams know the approximate locations of their starting point and the gate locations, and they must deal with slight variations during the actual race. In Test 3, teams are challenged to create a single algorithm that can be tested on 25 slight variations of the same Challenge Course.

Teams are given the Challenge YAML files on which they will be tested; these files specify the initial conditions for each run. For Leaderboard Testing, teams are given all 25 Challenge YAML Files and self-report their scores. For Final Testing, teams are not given the variations on the Challenge YAML Files, and their algorithms will be run and scored by HeroX.

Algorithms are not allowed to use ground-truth state data, only inputs from their sensors. The vehicle and sensor feeds are modelled to include real-world errors such as thrust-limited motors and rate-based control drift. Teams will have access to this noisy data, as well as some visual processing outputs. In summary, the allowable sensory inputs for algorithms are:

  • Photorealistic 60Hz RGB camera feeds – mono and stereo allowed
  • “Perfect” (not noisy) gate detection algorithm output (polygon coordinates) in the camera frame
  • “Perfect” (not noisy) IR marker locations in the camera frame
  • Noisy IMU data
  • Noisy laser range-finder data for height estimation – the sensor points perpendicular to the ground when the drone is level

Using these sensory inputs, teams are permitted to conduct any processing that they desire. No visual-based obstacle avoidance is necessary (course paths are mostly clear of obstacles). That being said, it is possible to crash into things and fail the run.
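
Because only noisy feedback is available, some state estimation is usually required. As one illustrative approach (not a required or endorsed design), a simple complementary filter could fuse the noisy IMU vertical acceleration with the noisy rangefinder for height estimation:

```python
# Illustrative complementary filter: fuse noisy IMU vertical acceleration
# with the noisy downward rangefinder to estimate altitude. One possible
# approach only; all numeric values are hypothetical.

class AltitudeFilter:
    def __init__(self, alpha: float = 0.05):
        self.alpha = alpha  # strength of the rangefinder correction
        self.z = 0.0        # altitude estimate (m)
        self.vz = 0.0       # vertical velocity estimate (m/s)

    def predict(self, az: float, dt: float) -> None:
        """Integrate gravity-compensated IMU vertical acceleration (m/s^2)."""
        self.vz += az * dt
        self.z += self.vz * dt

    def correct(self, range_z: float) -> None:
        """Nudge the estimate toward the rangefinder (valid when near level)."""
        self.z += self.alpha * (range_z - self.z)

f = AltitudeFilter()
f.predict(az=0.3, dt=1 / 60)  # one IMU step at 60 Hz
f.correct(range_z=1.02)       # one rangefinder reading
print(f.z, f.vz)
```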

  • Note: AlphaPilot is aware that it will be possible to back out the exact global 3D locations of gates. Teams are not allowed to utilize this data in their algorithms, and any teams that do so will be flagged during judging.
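
Teams can, however, work directly in the camera frame. As a small hypothetical illustration of using the gate-detection output listed above, a visual-servoing controller might steer toward the centroid of the detected gate polygon; the corner coordinates and image size below are invented for the example:

```python
# Hypothetical use of the gate-detection output: compute the pixel error
# between the gate polygon centroid and the image center, to be mapped to
# rate commands by a controller. Corner values and image size are invented.

def gate_centroid(corners):
    """Centroid of the four gate polygon corners (pixel coordinates)."""
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    return sum(xs) / 4.0, sum(ys) / 4.0

corners = [(220, 130), (410, 128), (412, 330), (218, 333)]  # from gate detection
cx, cy = gate_centroid(corners)
err_x = cx - 320  # horizontal offset from a 640x480 image center
err_y = cy - 240  # vertical offset
print(err_x, err_y)  # feed into yaw-rate and thrust commands
```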

Algorithms should be built upon the FlightGoggles ROS framework. Teams have the flexibility to utilize any programming language in ROS nodes. Teams may leverage open-source algorithms and data libraries to assist in their design. The written submissions should document important elements of your approach (e.g. language, architecture) and explain why you believe this approach is both favorable for performance in Test #3 and scalable to the larger AlphaPilot challenge.

Entries to both the Leaderboard and Final Exam must be submitted using the Test #3 entry form provided on the AlphaPilot HeroX website. Entries for the Leaderboard Challenge Course must be a single Scores YAML file submitted as an attachment. This file contains scores from all 25 Results YAML files and is used by HeroX to calculate each team’s Leaderboard Algorithm Score (the top 5 scores are found and averaged).

Algorithm entries for the Final Challenge Course (see ‘Final Testing’ below) must be submitted as executable programs along with source code before the final deadline. The final submission form will contain fields for attaching a zip file of your executable code, source code, and technical report.

See the ‘Testing’ section for more details about how algorithms will be tested.

Note: GPU and CUDA drivers will already be installed in the testing environment. The following instance will be used as the basis for the testing environment: https://aws.amazon.com/marketplace/pp/B077GCZ4GR

 

Technical Report Requirements:

In the Technical Report, teams must document how their algorithms work and how they conducted testing and analysis. Teams should detail their use of any libraries or well-known approaches. In addition to describing their approach, teams must also address:

  • How the team plans to build onto or modify their drone racing algorithms for use in the AlphaPilot competition should they qualify.
  • Any technical issues the team ran into and how they overcame them.

Reports must be in PDF format, 2 pages maximum, single-spaced, with a minimum font size of 11 pt. Text, equations, images, figures, plots, code, and pseudo-code are all accepted but count toward the 2-page limit.

 

Testing:

The Leaderboard and Final Challenge Course are now out! Please note HeroX AlphaPilot will only accept leaderboard submissions at this time, and there is no need to submit source code until the final deadline.

The Leaderboard Challenge is comparable to the Final Challenge Course and will give teams a rough idea of how well they will ultimately perform. The Challenge Course YAML files are included in the given Leaderboard Challenge Files and represent the initial conditions for the drone and gate locations. Again, these vary slightly over the 25 runs, and team scores will be averaged over multiple runs.

 

Leaderboard Testing

Teams can now download the Leaderboard Challenge Course files (challenge_leaderboardtest.zip) and submit their Scores YAML file to the leaderboard for evaluation.

The Leaderboard Challenge Course folder contains the 25 YAML files, which are slight variations on the above Challenge Course. Teams are expected to test their algorithms on each of these YAML files and output a Results YAML file per run (as done by the Reporter node). Using the scoring scripts provided, teams can generate the Scores YAML file to submit to the Leaderboard. The Scores YAML file will be read, the top 5 scores will be found and averaged, and this final score will be posted on the Leaderboard.

The first round on the leaderboard will close on Monday, Feb. 18th at 11:59 PM PST, and any submissions will appear on the leaderboard by 9 AM PST the following day. After this, the leaderboard will refresh daily until the final deadline. Teams can update their submission multiple times per day, but only their most recently uploaded YAML files will be evaluated and scored.

 

Final Testing

The Final Challenge Course is also based on the above Challenge Course, but teams will not receive the 25 Challenge YAML files, which again are slight variations on the initial conditions. Teams may assume that the stochastic variations will mirror those provided in the Leaderboard Challenge Course Challenge YAML files.

By the final deadline, teams must submit both source code and binaries for evaluation on these 25 sequestered Challenge YAML files. Each team’s folder of source code and binaries will be unzipped onto an Ubuntu 16.04.5 LTS instance of AWS type p3.2xlarge. The following instance will be used as the basis for the testing environment: https://aws.amazon.com/marketplace/pp/B077GCZ4GR

Each team’s binaries will be run within this testing environment on all 25 Challenge YAML files and the scores will be reported in the Results YAML files. The scoring scripts provided will then be run to produce the Scores YAML file (as used in the Leaderboard Testing). Finally, the team’s final Test 3 Algorithm Score is calculated from their best 5 runs.

Separately, judges will read and score each team’s technical report, which will be subsequently used to calculate their total final score for Test 3.

 

Frequently Asked Questions (FAQs)

Can teams utilize the global 3D locations of the gates?

As is typical in drone racing, teams know approximately the initial drone and gate locations and must deal with slight variations during the actual race. Teams are challenged to create a single algorithm that can be tested on 25 slight variations of the same Challenge Course.

Teams are given Challenge YAML files that indicate the initial location of the drone and gates for each exam and on which team algorithms will be tested. For Leaderboard Testing, teams are given all 25 Challenge YAML Files and self-report their scores. For Final Testing, teams are not given the variations on the Challenge YAML Files, and their algorithms will be run and scored by HeroX.

So in general, no, teams cannot utilize the exact global 3D locations of the gates. However, they can approximate them from visual processing and other on-board feedback.

Note: AlphaPilot is aware that there is a hack that makes it possible to back out the exact global 3D locations of gates. Teams are not allowed to utilize this data in their algorithms, and any teams that do so will be flagged during judging.

 

Will we have to do obstacle avoidance?

No. Visual-based obstacle avoidance is not needed (course paths are mostly clear of obstacles). That being said, it is possible to crash into things and fail the run.

 

Can teams utilize the stereo cameras?

Yes. You can use the photorealistic 60Hz RGB camera feeds – mono and stereo allowed.

 

Will you give teams information about the drone model?

More details about the drone (vehicle dynamics, sensor models, specifications, etc.) are being added to the GitHub repo.

 

If you have any specific questions on Test #3, please comment in this forum thread.
