Autonomous systems (AS) are systems that can adapt and change their behavior
in response to unanticipated events and include systems such as aerial drones,
autonomous vehicles, and ground/aquatic robots. AS require a wide array of
sensors, deep-learning models, and powerful hardware platforms to perceive
their environment and operate safely in real time. However, in many contexts, some sensing modalities
negatively impact perception while increasing the system's overall energy
consumption. Since AS are often energy-constrained edge devices,
energy-efficient sensor fusion methods have been proposed. However, existing
methods fail either to adapt to changing scenario conditions or to optimize
energy efficiency system-wide. We propose CARMA: a context-aware sensor fusion
approach that uses context to dynamically reconfigure the computation flow on a
Field-Programmable Gate Array (FPGA) at runtime. By clock-gating unused sensors
and model sub-components, CARMA significantly reduces the energy used by a
multi-sensory object detector without compromising performance. We use a
Deep-learning Processor Unit (DPU) based reconfiguration approach to minimize
the latency of model reconfiguration. We evaluate multiple
context-identification strategies, propose a novel system-wide
energy-performance joint optimization, and evaluate scenario-specific
perception performance. Across challenging real-world sensing contexts, CARMA
outperforms state-of-the-art methods with up to 1.3x speedup and 73% lower
energy consumption.

Comment: Accepted for publication in the 2023 ACM/IEEE International Symposium
on Low Power Electronics and Design (ISLPED 2023).