

logging of data at prescribed points within a sequential mission plan. For example, a
sonar might be triggered to log data at the start of each survey line and to stop logging
when the line is completed, or systems might be powered off completely at prescribed
points in the mission plan, for example to actively manage power consumption. At the
Basic Autonomy level these actions are programmed by the operator as part of the
mission plan rather than undertaken by the vehicle itself, except perhaps when load
shedding in emergency circumstances. Basic Autonomy may also include the ability to
time-stamp and log data from sensors that do not have their own native logging
capability (i.e. they simply produce data when activated), and to manage those log files
along with other mission logs (e.g. rotating logs when appropriate, organizing logs by
mission, etc.). Basic Autonomy is distinguished from Intermediate Autonomy in that
sensors must be manually configured and reconfigured as necessary.
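A minimal sketch of this kind of mission-plan-driven logging is shown below, assuming a hypothetical sensor that simply produces readings when polled and has no native logging of its own; the MissionLogger class, file layout, and naming convention are illustrative only and are not drawn from any particular vehicle's software.

import os
import time

class MissionLogger:
    """Time-stamps and logs readings from a sensor with no native logging,
    organizing the resulting files by mission and by survey line."""

    def __init__(self, mission_id, log_root="logs"):
        self.mission_dir = os.path.join(log_root, mission_id)
        os.makedirs(self.mission_dir, exist_ok=True)
        self.log_file = None

    def start_line(self, line_id):
        # Called at the start of each survey line, as prescribed in the plan.
        path = os.path.join(self.mission_dir, f"line_{line_id}.csv")
        self.log_file = open(path, "w")
        self.log_file.write("timestamp,value\n")

    def record(self, value):
        # Time-stamp a raw reading from a sensor that simply produces data.
        if self.log_file is not None:
            self.log_file.write(f"{time.time():.3f},{value}\n")

    def stop_line(self):
        # Called when the survey line is completed; closes the current log.
        if self.log_file is not None:
            self.log_file.close()
            self.log_file = None

In keeping with the Basic Autonomy definition, the start_line and stop_line calls would be issued at points fixed by the operator in the mission plan, not decided by the vehicle.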
Level 3: Intermediate Autonomy (do as you’re told and react to what’s known)
Self-awareness: Intermediate Autonomy involves the implementation of models of
vehicle performance that are informed by sensor inputs in real-time to provide a more
complete estimate of the vehicle’s state. The obvious example is that of an Extended
Kalman Filter operating on various (and possibly multiple) measures of position,
velocity, acceleration, and orientation to estimate the complete pose of the vehicle.
However, other, simpler models are possible, for example knowledge of the vehicle’s turn
radius, such that Williamson turns can be executed when waypoints fall within it, or fuel
consumption models for various speeds that allow the vehicle to predict its ability to
complete a mission.
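As a concrete illustration of the Kalman filter example above, the following is a minimal sketch of an Extended Kalman Filter fusing dead-reckoned speed and yaw rate with GPS position fixes to estimate planar pose (x, y, heading); the state layout, noise values, and function names are assumptions made for illustration and are not taken from the workshop or from any specific vehicle.

import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """Propagate pose (x, y, heading) forward using speed v and yaw rate w."""
    px, py, th = x
    x_pred = np.array([px + v * np.cos(th) * dt,
                       py + v * np.sin(th) * dt,
                       th + w * dt])
    # Jacobian of the motion model with respect to the state.
    F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],
                  [0.0, 1.0,  v * np.cos(th) * dt],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update(x, P, z, R):
    """Correct the predicted pose with a GPS position fix z = (x_gps, y_gps)."""
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])   # GPS observes position only
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P

# Illustrative noise settings and a single predict/update cycle.
x = np.array([0.0, 0.0, 0.0])          # initial pose estimate
P = np.eye(3)                          # initial uncertainty
Q = np.diag([0.05, 0.05, 0.01])        # process noise
R = np.diag([2.0, 2.0])                # GPS measurement noise (m^2)
x, P = ekf_predict(x, P, v=1.5, w=0.1, dt=1.0, Q=Q)
x, P = ekf_update(x, P, z=np.array([1.4, 0.1]), R=R)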
Operations: Intermediate Autonomy involves the ability to adjust a pre-planned mission
reactively, based on fixed (i.e. not dynamically sensed) input and according to fixed
rules, for example, to avoid shallow water, charted hazards to navigation, or a polygon
defining a prohibited operational area. Intermediate Autonomy is distinguished from
Basic Autonomy in that the effect is a behavioral response to a set of conditions in
addition to accomplishing a set of tasks in sequence. This level of autonomy is
distinguished from Advanced Autonomy in that it applies a fixed set of rules to a fixed
set of conditions in which the data used to evaluate the conditions is not actively sensed
in real time but rather is known a priori.
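The following is a minimal sketch of the kind of fixed rule applied to a priori data that this level implies, assuming the prohibited operational area is supplied as a polygon of charted vertices; the ray-casting test and the example coordinates are illustrative and do not describe any particular vehicle's planner.

def point_in_polygon(pt, polygon):
    """Ray-casting test: does a waypoint fall inside a charted polygon?
    The polygon is a list of (x, y) vertices known a priori, not sensed."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Fixed rule: flag any pre-planned waypoint inside the prohibited area so the
# mission can be adjusted before (or while) the line is run.
prohibited = [(0, 0), (10, 0), (10, 10), (0, 10)]   # charted a priori
waypoints = [(5, 5), (15, 5)]
violations = [wp for wp in waypoints if point_in_polygon(wp, prohibited)]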
Sensor: Intermediate Autonomy involves the ability to programmatically configure or
reconfigure a sensor, setting values based on those specified within the mission plan at
defined points along the route. Examples of programmatically specified settings might
include specifying within the mission plan an increase in side-scan sonar maximum
range, knowing that the survey will progress into deeper water. Intermediate Autonomy
is distinguished from Advanced Autonomy in that configuration (or reconfiguration) of
the sensors is fixed, specified a priori in the mission plan, rather than adjusted
according to sensed input.
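A minimal sketch of such plan-driven reconfiguration, assuming a mission plan in which each waypoint may carry sensor settings fixed a priori; the plan structure, the navigate_to callable, and the sonar.set_range call are hypothetical placeholders for whatever interface a real side-scan sonar driver and guidance system expose.

# Each mission plan entry pairs a waypoint with sensor settings fixed a priori.
mission_plan = [
    {"waypoint": (44.90, -68.67), "sonar": {"max_range_m": 50}},
    {"waypoint": (44.91, -68.65), "sonar": {"max_range_m": 75}},   # deeper water ahead
    {"waypoint": (44.92, -68.63), "sonar": {"max_range_m": 100}},
]

def run_plan(plan, sonar, navigate_to):
    """Apply the pre-specified sonar settings on arrival at each waypoint."""
    for leg in plan:
        navigate_to(leg["waypoint"])
        settings = leg.get("sonar")
        if settings:
            # Reconfiguration is fixed in the plan, not adjusted from sensed data.
            sonar.set_range(settings["max_range_m"])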
Level 4: Advanced Autonomy (do as you’re told, sense and react to what’s not known)
Self-awareness: Advanced Autonomy involves the ability to recognize when the vehicle’s
movement or other sensed parameters do not fit the expected model and to react to