
Planned vs Actual Route Analysis

When a mission plan or route geometry is available, this module compares planned and actual movement to show where execution diverged in time and space. It produces evidence-backed findings for evaluator review in AAR and post-exercise assessment.

Key Findings Produced
  • Loss of route alignment relative to the plan
  • Waypoint omission or bypass
  • Timing and duration of deviation and recovery
  • Areas where planned and actual movement diverged
Findings are descriptive and evidence-linked. They do not assign fault, automate grading, or replace evaluator judgment.
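The deviation-onset, duration, and recovery findings above can be thought of as grouping over-threshold cross-track distances into events. The sketch below is illustrative only, not the module's implementation: the `DeviationEvent` type, the 50 m default threshold, and the `(time, distance)` input shape are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class DeviationEvent:
    start_s: float  # seconds from mission start when deviation began
    end_s: float    # seconds when the track re-converged (recovery)

    @property
    def duration_s(self) -> float:
        return self.end_s - self.start_s

def detect_deviations(samples, threshold_m=50.0):
    """Group timestamped cross-track distances into deviation events.

    `samples` is a list of (time_s, cross_track_m) pairs. A deviation
    starts when distance exceeds `threshold_m` and ends (recovery)
    when it drops back under the threshold. Hypothetical sketch.
    """
    events, start = [], None
    for t, d in samples:
        if d > threshold_m and start is None:
            start = t  # deviation onset
        elif d <= threshold_m and start is not None:
            events.append(DeviationEvent(start, t))  # recovery observed
            start = None
    if start is not None:  # track never re-converged before data ended
        events.append(DeviationEvent(start, samples[-1][0]))
    return events
```

Keeping onset, recovery, and duration as one event object is what lets a finding report "where deviation began, not just where it ended."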

Why This Review Matters

Small route deviations can be easy to miss during an exercise and costly to reconstruct afterward. Clear comparison between plan and actual movement helps evaluators identify where execution changed, when it changed, and what should be addressed in AAR and post-exercise review.

What Becomes Visible

This module turns route comparison into evidence-backed findings by showing when and where execution diverged from the planned route, with supporting context for evaluator review.
Evaluator Findings
  • Where deviation began, not just where it ended
  • How long deviation persisted
  • Where recovery occurred or did not occur
  • Observable movement patterns associated with route divergence
What This Supports
  • More consistent after-action review
  • Reduced manual reconstruction
  • Shared reference points for evaluator discussion
  • More defensible post-exercise assessment

How Evaluators Use This Module

During AAR
  • Identify where deviation began within the review window
  • Compare planned and actual movement using shared reference points
  • Reduce subjective interpretation during debrief
During post-exercise assessment
  • Preserve contextual evidence for review
  • Anchor feedback in observable movement behavior
  • Support more defensible evaluation outputs

Integration & Deployment

Designed for integration into existing mission, training, and evaluation workflows.

All outputs are timestamped, traceable, and reproducible from source data.
Data Interface
Inputs
  • Mission plan or route geometry when available
  • Post-mission movement data
  • Mission metadata and timestamps
Outputs
  • Structured deviation events
  • Planned-versus-actual comparison objects
  • Evidence-backed findings for evaluator review
  • JSON outputs for downstream systems, with optional PDF review extract
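As an illustration of what a structured deviation event might look like in the JSON output, the fragment below is a hypothetical payload: every field name and value here is an assumption for the example, not the module's actual schema.

```json
{
  "event_type": "route_deviation",
  "mission_id": "EX-2024-031",
  "deviation_start": "2024-05-14T09:12:07Z",
  "recovery_time": "2024-05-14T09:18:42Z",
  "duration_s": 395,
  "max_cross_track_m": 412.5,
  "nearest_planned_waypoint": "WP-07",
  "evidence": {
    "source_track": "track_0143.json",
    "sample_indices": [512, 688]
  }
}
```

Carrying an `evidence` reference back to the source track is what keeps findings timestamped, traceable, and reproducible from source data.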
Execution & Control
Deployment Models
  • Standalone module
  • Containerized service
  • API-ready integration
  • Customer-controlled deployment options
Configuration
  • Thresholds defined in configuration files
  • Adjustable deployment parameters
  • Core evaluation logic remains fixed and deterministic across deployments
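A configuration file of this kind might look like the sketch below. Keys and values are hypothetical, shown only to illustrate the split between adjustable thresholds and fixed evaluation logic.

```yaml
# Illustrative only: actual keys depend on the deployment.
deviation:
  cross_track_threshold_m: 50   # distance from the planned route that counts as deviation
  min_duration_s: 15            # ignore excursions shorter than this
waypoint:
  capture_radius_m: 100         # within this radius a waypoint counts as reached
```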
Security & Licensing
Deployment models and licensing are tailored to each environment and its mission constraints.