
    Towards an agent-driven scenario awareness in remote sensing environments

    In dynamic environments, autonomous and unmanned vehicle systems (UVSs) represent a reliable solution, especially when high performance is a stringent requirement for complex and risky tasks such as searching for survival points, monitoring multiple targets, and tracking. In these cases, cooperative activities among the involved UVSs are strategic for achieving a collective goal. When UVS teams work collaboratively, they collect heterogeneous data from multiple sources and provide benefits through enhanced situational awareness (SA). Multi-UVS scenarios are, by their nature, naturally modeled as multi-agent systems. This paper presents an agent-based model governing different types of unmanned vehicles that are sent ahead into an area of interest to gather environmental, sensor, and image data in order to provide a complete multi-view understanding of the scenario. The agent model is instantiated in each vehicle and encapsulates a semantic mental modeler customized for that vehicle's features. The agents collect raw data from the environment and translate them into high-level knowledge, i.e., a conceptualization of the data semantics (e.g., a set of pixels takes on the meaning of a car). The proposed agent-based model relies on a synergy between Semantic Web technologies and Fuzzy Cognitive Map (FCM) models, producing a high-level description of the evolving scenes and, in turn, a comprehensive scenario situational awareness.
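
    The sketch below is not the authors' implementation; it is a minimal illustration, under assumed concept names and causal weights, of how a standard Fuzzy Cognitive Map update (sigmoid-squashed activation spreading) can turn an agent's low-level detections into higher-level scene concepts of the kind the abstract describes.

```python
# Illustrative sketch (not the paper's code) of a Fuzzy Cognitive Map (FCM)
# turning per-vehicle detections into a high-level scene description.
# Concept names and weights are hypothetical.
import numpy as np

def sigmoid(x, lam=1.0):
    """Standard FCM squashing function mapping activations into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-lam * x))

def fcm_step(activations, weights, lam=1.0):
    """One synchronous FCM update: A(t+1) = f(A(t) + A(t) @ W)."""
    return sigmoid(activations + activations @ weights, lam)

def run_fcm(activations, weights, steps=20, tol=1e-4):
    """Iterate the update until concept activations stabilise."""
    for _ in range(steps):
        new = fcm_step(activations, weights)
        if np.max(np.abs(new - activations)) < tol:
            return new
        activations = new
    return activations

# Assumed concepts: the first three come from low-level sensing/vision,
# the last two are high-level scene interpretations the agent reports.
concepts = ["smoke_detected", "car_detected", "road_obstructed",
            "fire_risk", "evacuation_route_blocked"]

# Hypothetical causal weights W[i, j]: influence of concept i on concept j.
W = np.zeros((5, 5))
W[0, 3] = 0.8   # smoke strongly suggests fire risk
W[1, 2] = 0.4   # a detected car may partially obstruct the road
W[2, 4] = 0.7   # an obstructed road suggests a blocked evacuation route
W[3, 4] = 0.3   # fire risk also raises the chance the route is blocked

# Initial activations from the vehicle's raw-data interpretation
# (e.g. a set of pixels recognised as a car -> car_detected = 0.9).
A0 = np.array([0.7, 0.9, 0.0, 0.0, 0.0])

A = run_fcm(A0, W)
for name, value in zip(concepts, A):
    # With the sigmoid formulation, 0.5 is the neutral activation level.
    print(f"{name}: {value:.2f}")
```

    In such a setup, the resulting activations of the high-level concepts would be what each agent contributes to the shared scenario description; the semantic (Semantic Web) layer mentioned in the abstract would supply the vocabulary for those concepts, which is only assumed here.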