Glossary of Terms

Command Area: The location for the Ground Control Station, Command Server, Takeoff and Landing Zone, and members of the team who are involved in the scored mission (excluding the Remote Pilot in Command). It may be assumed that devices within the Command Area (including the UAV while it is in the Landing Zone) will have unobstructed radio communications with each other. For Stage 3, the Command Area may be in a location without Visual Line of Sight of, and/or radio communications with, parts of the flight area.

 

Command Server: A device, provided by the competition organizers, consisting of a Wi-Fi access point operating in the 2.4 GHz and/or 5.8 GHz bands and servers as described elsewhere in this document. During the scored mission, the team’s UAS will connect to the Command Server and provide information that the UAV has gathered. It is up to the team whether the UAV connects directly to the Command Server or connects to the Ground Control Station, which then relays the information to the Command Server.
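
For illustration only, the following is a minimal sketch of how a team might report gathered data to the Command Server over the Wi-Fi link described above, either directly from the UAV or relayed through the Ground Control Station. The actual server address, interface, and data format are defined elsewhere in this document; the HTTP endpoint, URL, and field names used here are assumptions made purely for this example.

```python
# Minimal sketch: reporting UAV-gathered data to the Command Server.
# ASSUMPTIONS: the address, endpoint path, and JSON field names below are
# hypothetical; the real interface is specified elsewhere in this document.
import requests

COMMAND_SERVER_URL = "http://192.168.1.1:8080/report"  # hypothetical address


def report_to_command_server(payload: dict) -> bool:
    """Send data gathered by the UAV to the Command Server.

    The UAV may call this directly over the 2.4 GHz / 5.8 GHz Wi-Fi link,
    or the Ground Control Station may call it after receiving the same
    payload from the UAV over the team's own telemetry link.
    """
    try:
        response = requests.post(COMMAND_SERVER_URL, json=payload, timeout=5)
        return response.ok
    except requests.RequestException:
        return False  # retry behavior is left to the team's implementation


if __name__ == "__main__":
    # Example relay path: the Ground Control Station forwards what the UAV gathered.
    gathered = {"target_id": "example-01", "data": "placeholder"}
    report_to_command_server(gathered)
```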

 

Ground Control Station: A stand-alone device, possibly but not necessarily hand-held, provided by the team. In a real response situation, and in the presence of the appropriate Certificates of Authorization or waivers, this would be the main/only controller for the UAS. During the Stage 3 competition, an FAA Part 107 Remote Pilot in Command shall hand over control to the Ground Control Station to begin the scored mission, during which time the team will interact with, and obtain scored data from, the UAS through the Ground Control Station.

 

Measurement: The actual number or numbers for the metric(s) that result from a test method. For example, the measurement of the Time metric in the Endurance test for a particular system configuration might be 20 minutes. 

 

Metric: The name of the measurement that is generated by a test method. A test method may generate more than one metric. For example, the Endurance test generates two metrics, Iterations and Time. 
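
The relationship between a test method, its metrics, and the resulting measurements can be pictured with a small data structure. The sketch below is illustrative only and is not part of any competition software; the class and field names are chosen for this example, and the values simply mirror the Endurance example used in the two definitions above.

```python
# Illustrative sketch of the test method / metric / measurement relationship.
# The names and the 20-minute Time value mirror the Endurance example in the
# definitions above; nothing here is an actual competition result.
from dataclasses import dataclass, field


@dataclass
class Measurement:
    metric: str   # name of the metric this number belongs to, e.g. "Time"
    value: float  # the actual number produced by running the test
    unit: str     # e.g. "minutes"


@dataclass
class TestMethod:
    name: str
    metrics: list[str]  # a test method may generate more than one metric
    measurements: list[Measurement] = field(default_factory=list)


# The Endurance test generates two metrics, Iterations and Time; running the
# test on a particular system configuration yields one measurement per metric.
endurance = TestMethod(name="Endurance", metrics=["Iterations", "Time"])
endurance.measurements.append(Measurement(metric="Time", value=20, unit="minutes"))
```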

 

Subject Matter Expert (SME): An expert in their respective field, either from NIST or from a collaborating entity. SMEs will conduct independent reviews of the submissions received for the challenge. SMEs are not members of the judging panel; as such, they will provide recommendations to the judging panel based on the evaluation criteria and will not make any award determinations.

NIST PSCR will select members of the judging panel from the public safety industry, first responders, and PSCR to test and evaluate the submissions for the challenge. The judging panel will take SMEs’ recommendations into consideration when evaluating contestants’ submissions. The judging panel will make the final determination of awards for the challenge.

 

Target Object of Interest: The Stage 2 tests of Inspect and Download Data and Survey Acuity, and the Stage 3 Scenarios, involve the UAV inspecting and/or downloading data from ground sensors, referred to in this competition as Target Objects of Interest. These represent the objects that the UAV is seeking and from which it needs to observe and download data. Each consists of an apparatus that the UAV should inspect with its camera and a co-located radio transmitter to which the UAV should connect and from which it should download data.
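
As a simple illustration of the two actions associated with each Target Object of Interest, namely camera inspection of the apparatus and a data download from the co-located transmitter, the sketch below tracks whether each action has been completed. The class and field names are hypothetical and are not drawn from any competition interface.

```python
# Illustrative bookkeeping for a Target Object of Interest (TOI).
# Each TOI pairs a visual apparatus (inspected with the UAV's camera) with a
# co-located radio transmitter (connected to for a data download).
# All names here are hypothetical.
from dataclasses import dataclass


@dataclass
class TargetObjectOfInterest:
    toi_id: str
    inspected: bool = False   # camera inspection of the apparatus completed
    downloaded: bool = False  # data downloaded from the co-located transmitter

    @property
    def complete(self) -> bool:
        # A TOI is fully serviced once it has been inspected and its data
        # has been downloaded.
        return self.inspected and self.downloaded


toi = TargetObjectOfInterest(toi_id="example-01")
toi.inspected = True
toi.downloaded = True
assert toi.complete
```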

 

Test Method (test): The specification for apparatus, procedure, and metrics that evaluate a particular capability, to some known or estimated level of statistical significance. 

 

Uncrewed Aircraft System (UAS): The UAV and associated team-provided equipment, including the Ground Control Station, Supervisory Controller, and support equipment. 

 

Uncrewed Aerial Vehicle (UAV): The singular, flying, uncrewed aircraft, including only the portions that are airborne. This may be a multi-rotor vehicle, a fixed-wing vehicle, or any other format that is compatible with FAA Part 107 regulations, as long as it satisfies the other requirements in this document.