Exercise Objectives

CURSE is designed to test the following, for both the UAS Remote Sensing Teams and the Remote Sensing Cell:

  • Objective 1 - Validate Distributed UAS Data Collection
  • Objective 2 - Assess Data Transfer and Integration to the Remote Sensing Cell (RSC)
  • Objective 3 - Evaluate AI-Supported Damage Assessment and Analytics Integration
  • Objective 4 - Measure the Effectiveness of the RSC
  • Objective 5 - Identify Gaps in Doctrine, Staffing, and Technology

Complete objective information is available further down this page.

Who Should Participate?

UAS Remote Sensing Teams, large and small, that may be tasked to support a local, regional, or statewide disaster or critical incident will gain valuable experience from CURSE 26. Due to the distributed format, travel is limited or non-existent, reducing the financial impact on agencies.

Past CURSE participants (in 2023 and 2024) included local, county, state, and national agency partners.

Registration Form

If your agency or organization would like to participate, complete the form below. If you are willing to host nearby teams at your location, please indicate that on the form.

Participant Interest Form

Contact Us

If you have questions, please reach out.

curse26@uaseoc.org

Summary

CURSE 2026 is a full‑scale exercise designed to evaluate the use of unmanned aircraft systems (UAS) and remote sensing workflows during a catastrophic incident. The exercise will be conducted on May 19, 2026, and is developed and conducted by the Emergency Management and Homeland Security (EMHS) Program at Florida State University in coordination with state, local, and federal partners.

Unlike traditional co‑located exercises, CURSE 26 employs a statewide distributed model. Participating agencies operate from their home jurisdictions, conducting UAS data collection and processing missions locally, while transmitting imagery and derived products to a centralized Remote Sensing Cell (RSC) located in Tallahassee. The RSC serves as the primary fusion, analysis, and coordination hub, integrating data from multiple sources into a common operating picture, and distributing products to stakeholders.

The exercise focuses on validating end‑to‑end remote sensing operations, including mission development and tasking, collaborative collection planning, resource management, data transfer, analytics, and product dissemination. CURSE 26 will utilize artificial intelligence algorithms developed by Texas A&M University and the NSF‑funded AI Institute for Societal Decision Making to produce initial damage assessment reports. These tools will be evaluated for their ability to accelerate analysis and improve situational awareness under realistic operational conditions.

The outcomes of CURSE 26 will inform after‑action reporting, improvement planning, applied research efforts, and future investments in UAS, remote sensing, and analytics capabilities.

Flight Locations

CURSE 26 is a full-scale exercise distributed across the State of Florida; agencies and organizations can participate from anywhere in the state (or nation). Information on available flight and mission locations will be published in the near future.

Participating agencies may travel to one of the exercise partner locations, or work with the exercise design team to include their home jurisdiction in the exercise. Agencies are encouraged to coordinate with other local agencies to develop exercise play areas that can be utilized by multiple teams.

Register

Those interested in playing are encouraged to fill out the Participant Interest Form. If you have questions, contact curse26@uaseoc.org.


Exercise Information

Schedule of Events

TBD - Scenario Planning Conference

May 18th, 1300 - Exercise Participant Briefing

May 19th, 0900 - Full Scale Exercise

Invitation Letter

[ Coming Soon ]

List of Participants

[ Coming Soon ]

Development Workshops

As we move toward the exercise, FSU EMHS will host a series of workshops designed to reinforce existing training and procedures on a variety of collection and processing topics.

Disaster Mapping Workshop

Date/Time: TBD
Topics: Mapping post-disaster, including workflows, collection requirements, processing, and coordination.

Air Operations Branch Workshop

Date/Time: TBD
Topics: Discussion on domain awareness, airspace management, communication, and coordination.

Damage Assessment Workshop

Date/Time: TBD
Topics: Discussion on utilizing remote sensing products in damage assessment, including oblique imagery, video, orthomosaics, and AI.

Data Management Workshop

Date/Time: TBD
Topics: How to manage gigabytes (or terabytes) of data from a wide geographic area, and get products to decision makers.


Exercise Objectives

CURSE 26 exercise objectives align with the National Preparedness Goal core capabilities, with primary emphasis on Operational Coordination, Situational Assessment, Communications, and Planning. The exercise is designed not only to demonstrate current capabilities, but also to identify gaps in doctrine, staffing models, workflows, data standards, and technology that limit the ability to scale UAS‑enabled remote sensing during major disasters.

Objective 1: Validate Distributed UAS Data Collection

Purpose:
Evaluate the ability of geographically dispersed UAS teams to conduct data collection in support of a catastrophic incident without team co‑location or centralized staging areas.

Key Focus Areas:

  • Conduct independent field operations
  • Provide consistent mission execution
  • Place minimal logistical/financial burden on participating agencies

Metrics:

  1. Participation Coverage
    • Number of agencies and UAS teams successfully executing assigned collection tasks from their distributed locations.
  2. Mission Completion Rate
    • Percentage of assigned remote sensing missions completed within the operational period.
  3. Operational Friction
    • Number and type of issues reported related to airspace coordination, hardware/software limitations, crew availability, or regulatory approvals.

Objective 2: Assess Data Transfer and Integration to the Remote Sensing Cell (RSC)

Purpose:
Evaluate the ability of participating teams to move collected imagery and data to the Remote Sensing Cell in near‑real time and integrate it into a common operating environment.

Key Focus Areas:

  • Evaluate network connectivity and tools
  • Test data and product management workflows
  • Ensure interoperability across agencies/teams

Metrics:

  1. Data Latency
    • Time from mission completion to data availability at the RSC.
  2. Successful Data Ingest
    • Number and percentage of datasets successfully received, cataloged, and made accessible to stakeholders at the RSC.
  3. Data Quality Threshold
    • Number and percentage of datasets meeting minimum resolution, coverage, and metadata standards required for analysis and distribution.
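To make the Objective 2 metrics concrete, the sketch below shows one way they could be computed from per-dataset logs. This is a minimal illustration, not part of the exercise's actual tooling: the record fields (`mission_complete`, `rsc_available`, `meets_quality`) are hypothetical stand-ins for whatever the exercise logging captures.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class DatasetRecord:
    # Hypothetical log fields; actual exercise logging may differ.
    mission_complete: datetime            # time the UAS team finished collection
    rsc_available: Optional[datetime]     # time data became accessible at the RSC (None = ingest failed)
    meets_quality: bool                   # passed resolution/coverage/metadata checks

def objective2_metrics(records: list) -> dict:
    """Compute the three Objective 2 metrics from dataset logs."""
    ingested = [r for r in records if r.rsc_available is not None]
    # Data Latency: mission completion to availability at the RSC, in minutes
    latencies = [(r.rsc_available - r.mission_complete).total_seconds() / 60
                 for r in ingested]
    return {
        "mean_latency_min": sum(latencies) / len(latencies) if latencies else None,
        # Successful Data Ingest: share of datasets received and cataloged
        "ingest_rate_pct": 100 * len(ingested) / len(records) if records else None,
        # Data Quality Threshold: share of ingested datasets meeting standards
        "quality_pass_pct": (100 * sum(r.meets_quality for r in ingested) / len(ingested)
                             if ingested else None),
    }
```

For example, two datasets where one arrives 30 minutes after mission completion and the other never reaches the RSC would yield a 50% ingest rate and a 30-minute mean latency.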

Objective 3: Evaluate AI‑Supported Damage Assessment and Analytics Integration

Purpose:
Test the operational integration of AI‑based damage assessment tools (e.g., CLARKE) into the remote sensing workflow to support rapid situational awareness and decision‑making.

Key Focus Areas:

  • Test AI model ingestion pipelines
  • Evaluate analyst interaction with AI outputs
  • Determine trust and usability of automated assessments

Metrics:

  1. Processing Success Rate
    • Number and percentage of datasets successfully processed through AI damage assessment workflows.
  2. Time to First Analytical Product
    • Elapsed time from data ingest to delivery of an initial AI‑generated damage assessment product.
  3. Analyst Confidence Feedback
    • Qualitative feedback from RSC analysts and stakeholders on usefulness, clarity, and trustworthiness of AI outputs.

Objective 4: Measure the Effectiveness of the Remote Sensing Cell

Purpose:
Assess the RSC’s ability to function as a centralized fusion and coordination element for distributed UAS operations, analytics, and product dissemination during a major statewide event.

Key Focus Areas:

  • Establish staffing needs and role clarity
  • Evaluate workflow coordination
  • Test product development, management, and distribution

Metrics:

  1. Product Throughput
    • Number of actionable products (maps, summaries, reports) generated during the exercise operational period.
  2. Coordination Effectiveness
    • Frequency and severity of workflow bottlenecks or handoff failures within the RSC.
  3. Stakeholder Satisfaction
    • Post‑exercise feedback from participating agency stakeholders on clarity, responsiveness, and usefulness of RSC products and information.

Objective 5: Identify Gaps in Doctrine, Staffing, and Technology

Purpose:
Define capability gaps that limit the effectiveness of UAS remote sensing operations during catastrophic incidents.

Key Focus Areas:

  • Evaluate UAS remote sensing team composition
  • Identify training needs
  • Identify technology, aircraft, and systems gaps

Metrics:

  1. Documented Capability Gaps
    • Number of recurring gaps identified across agencies (e.g., staffing, bandwidth, analysis capacity).
  2. Workarounds Required
    • Frequency of ad‑hoc solutions used to complete tasks, including discussion and summary of issues.