Organized by Anti-UAV

First phase starts: Feb. 26, 2020, midnight UTC

Competition ends: April 16, 2020, midnight UTC

The 1st Anti-UAV Workshop & Challenge

Description of the 1st Anti-UAV Challenge

Recently, unmanned aerial vehicles (UAVs) have been growing rapidly across a wide range of consumer communication and networking applications, thanks to their autonomy, flexibility, and broad range of application domains. UAVs offer possible civil and public-domain applications in which single or multiple UAVs may be used. At the same time, we must also be aware of the potential threat to our lives caused by UAV intrusion. Earlier this year, multiple instances of drone sightings halted air traffic at airports, leading to significant economic losses for airlines. This workshop focuses on state-of-the-art anti-UAV systems in a bid to safeguard flights.

Current computer vision research on UAVs lacks a high-quality benchmark in dynamic environments. To mitigate this gap, this workshop presents a benchmark dataset and an evaluation methodology for detecting and tracking UAVs. The dataset consists of 160 high-quality, Full HD video sequences (100 videos are used for test-dev and the remaining 60 for test-final), spanning multiple occurrences of multi-scale UAVs. This workshop also encourages participants to develop fully automatic approaches to detecting and tracking UAVs in videos.

This workshop will bring together academic and industrial experts in the field of UAVs to discuss the techniques and applications of tracking UAVs. Participants are invited to submit their original contributions, surveys, and case studies that address UAV detection and tracking issues.

Topics of interest

The submissions are expected to deal with visual perception and processing tasks, including but not limited to:

  • Applications of computer vision on UAVs
  • Strategies for searching for UAVs based on NIR and/or VIS data
  • Spectrum sensing techniques for UAV detection
  • Localization and open-set identification of UAVs
  • Scene understanding for UAVs
  • Small/tiny object detection and tracking techniques
  • Fine-grained object recognition
  • Real-time deep learning inference
  • Infrared image and video analysis
  • Multimodal fusion techniques


Please feel free to send any questions or comments to:

Details can be found at:



You can download this dataset at Baidu Yun (sagx) / Google Drive

News:

  • 2020/02/19: added IR_label.json
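As an illustration, the added IR_label.json files can be read with a few lines of standard-library Python. The per-frame fields `exist` (visibility flag) and `gt_rect` ([x, y, w, h] box) are assumed here and should be checked against the released file:

```python
import json

def load_ir_labels(path):
    """Load per-frame ground-truth labels from an IR_label.json file.

    Assumed schema (verify against the released file):
      exist   -- list of 0/1 visibility flags, one per frame
      gt_rect -- list of [x, y, w, h] ground-truth boxes, one per frame
    """
    with open(path) as f:
        labels = json.load(f)
    return labels["exist"], labels["gt_rect"]
```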


We define the tracking accuracy as:


acc = Σ_t [ IoU_t · δ(v_t > 0) + p_t · (1 − δ(v_t > 0)) ] / T

Here IoU_t is the Intersection over Union (IoU) between the predicted tracking box and the corresponding ground-truth box in frame t, and v_t is the visibility flag of the ground truth; the indicator δ(v_t > 0) equals 1 when the target is visible. When the target is not visible, the tracker's predicted state p_t is used to measure the state accuracy. The accuracy is averaged over all T frames of a sequence.
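As a sketch, the per-sequence accuracy above can be computed as follows. Boxes are assumed to be in [x, y, w, h] format, and p_t is taken to be 1 when the tracker correctly reports the target absent in an invisible frame (one common reading of the state-accuracy term); function names are illustrative:

```python
def iou(box_a, box_b):
    """Intersection over Union of two [x, y, w, h] boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    iw = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    ih = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def tracking_accuracy(pred_boxes, gt_boxes, visible, pred_present):
    """acc = sum_t( IoU_t * d(v_t>0) + p_t * (1 - d(v_t>0)) ) / T

    visible[t]      -- ground-truth visibility flag v_t
    pred_present[t] -- tracker's claim that the target is present in frame t;
                       p_t is scored as 1 when the tracker correctly reports
                       absence in an invisible frame, else 0 (assumption).
    """
    T = len(gt_boxes)
    total = 0.0
    for t in range(T):
        if visible[t]:
            total += iou(pred_boxes[t], gt_boxes[t])
        else:
            total += 0.0 if pred_present[t] else 1.0
    return total / T
```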

We provide thermal infrared (IR) and RGB videos together with their ground-truth labels. Contestants may use both the IR and RGB videos, but only the ground-truth location in the first frame. The evaluation ranking is computed from the results on the IR videos.

Please refer to our baseline code.


Submissions must be made before the end of Phase 1. Only 5 submissions are allowed in the Final phase.


Development phase

Start: Feb. 26, 2020, midnight

Description: create models and submit them, or directly submit results on validation and/or test data; feedback is provided on the validation set only.


Final phase

Start: April 8, 2020, midnight

Description: submissions from the previous phase are automatically cloned and used to compute the final score. The results on the test set will be revealed when the organizers make them available. Only 5 submissions are allowed in the Final phase.

