Build the autonomy and perception systems that enable our interceptors to detect, track, and defeat threats in contested environments.
About Praetorian
Praetorian Aeronautics builds advanced autonomous aerial systems and AI-enabled command and control to defend against the rapidly evolving threat of drone warfare.
Headquartered in Adelaide, we are developing an integrated ecosystem of counter-autonomy systems:
- Arrow – a high-speed VTOL interceptor designed for beyond-visual-range kinetic neutralisation of multiple UxS threats
- Dagger – a modular interceptor for long-range, high-altitude kinetic neutralisation of UxS targets
- Hadrian – an AI-enhanced command and control system enabling operators to deploy interceptors at scale
- Venator – a long-endurance autonomous mothership UAS that deploys interceptors and extends operational reach
Together, these systems enable defence operators to detect, assess, and defeat autonomous threats while maintaining situational dominance in contested environments.
Why This Role Matters
Autonomous interceptors operating beyond visual range in GPS-denied, communications-degraded environments cannot rely on human input at the moment of engagement. Everything depends on what is running onboard — the perception pipeline, the mission execution logic, the autonomy stack that holds it all together under real-world constraints.
This role owns that layer. You will take advanced GNC and AI algorithms and make them work reliably on real hardware, in real environments, against real targets. The bridge between research capability and fielded mission system runs through this role.
The Role
You will join the Effectors team, reporting to the CTO, to lead development and deployment of embedded autonomy and perception systems across Praetorian’s interceptor platforms.
Working at the intersection of embedded systems, computer vision, and edge AI, you will own the full stack from camera integration and tracking pipelines through to onboard mission execution and autonomous failover. Your work will be validated in simulation, proven in SITL/HITL, and tested in live flight environments.
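To give a flavour of the tracking-pipeline work described above, here is a minimal sketch of IoU-based track-to-detection association — the simplest building block of multi-object tracking. All names are illustrative; a fielded pipeline would add a detector, motion prediction (e.g. Kalman filtering), and track lifecycle management.

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def associate(tracks, detections, threshold=0.3):
    """Greedily match existing track boxes to new detection boxes by IoU.

    Returns (matches, unmatched_detection_indices); unmatched detections
    would typically spawn new tracks.
    """
    matches, unmatched = [], list(range(len(detections)))
    for t_idx, t_box in enumerate(tracks):
        best, best_iou = None, threshold
        for d_idx in unmatched:
            score = iou(t_box, detections[d_idx])
            if score > best_iou:
                best, best_iou = d_idx, score
        if best is not None:
            matches.append((t_idx, best))
            unmatched.remove(best)
    return matches, unmatched
```

Greedy IoU association is shown here only because it is compact; production trackers commonly solve the assignment optimally (e.g. Hungarian algorithm) under the same latency budgets this role works within.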
What You’ll Do
- Lead development of onboard computer vision and multi-object tracking pipelines for real-time aerial target detection and track maintenance
- Deploy and optimise AI-enabled perception systems onto embedded and edge-compute platforms under latency and compute constraints
- Integrate autonomy stacks with PX4/ArduPilot, onboard compute, and EO/IR payload systems across interceptor platforms
- Develop and validate mission execution behaviours for comms-degraded and disconnected operations
- Build and maintain SITL/HITL environments and lead embedded autonomy testing campaigns through to live flight trials
- Work closely with the Lead GNC Engineer to integrate guidance outputs into deployable onboard mission systems
What We’re Looking For
- Strong experience developing and deploying computer vision, detection, and tracking systems in real-world environments
- Hands-on embedded systems experience — onboard compute, camera and sensor integration, edge AI deployment
- Proficiency in Python and C/C++
- Experience with SITL/HITL environments and progression through to live flight testing
- Degree in Computer Science, Robotics, Aerospace, or related field — or equivalent practical experience
Nice to Have
- Experience with PX4, ArduPilot, or MAVLink ecosystems
- Background in EO/IR payload integration
- Familiarity with AI/ML tooling — ClearML, CVAT, or similar
- Exposure to Unreal Engine or synthetic environment generation
- Experience with resilient autonomy in degraded communications environments
- Defence or aerospace autonomy background