On-board emergent scheduling of autonomous spacecraft payload operations
This paper describes a behavioral competency level concerned with emergent scheduling of spacecraft payload operations. The level is part of a multi-level subsumption architecture model for autonomous spacecraft, and it functions as an action selection system for processing spacecraft commands that can be considered 'plans-as-communication'. Several versions of the selection mechanism are described, and their robustness is qualitatively compared.
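The core idea of an action selection system in a layered subsumption architecture, where the highest-priority applicable behavior subsumes the layers below it, can be sketched roughly as follows. The class, behavior names, and state fields here are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of subsumption-style action selection; all names
# (Behavior, select_action, the example behaviors) are invented for
# illustration and do not come from the paper's architecture.

class Behavior:
    """A single behavioral competency with a fixed layer priority."""
    def __init__(self, name, priority, applicable, action):
        self.name = name
        self.priority = priority       # higher layers subsume lower ones
        self.applicable = applicable   # predicate over the sensed state
        self.action = action           # command emitted when selected

def select_action(behaviors, state):
    """Return the action of the highest-priority applicable behavior."""
    for b in sorted(behaviors, key=lambda b: -b.priority):
        if b.applicable(state):
            return b.action
    return "idle"                      # default when no layer fires

behaviors = [
    Behavior("safe_mode", 2, lambda s: s["fault"], "power_down_payload"),
    Behavior("observe", 1, lambda s: s["target_visible"], "point_and_image"),
]

print(select_action(behaviors, {"fault": False, "target_visible": True}))
# -> point_and_image (the fault layer is inactive, so observing wins)
```

A fault in the state would cause the higher `safe_mode` layer to subsume the observation behavior, which is the robustness property the paper compares across versions of the mechanism.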
A Survey on Interpretable Cross-modal Reasoning
In recent years, cross-modal reasoning (CMR), the process of understanding
and reasoning across different modalities, has emerged as a pivotal area with
applications spanning from multimedia analysis to healthcare diagnostics. As
the deployment of AI systems becomes more ubiquitous, the demand for
transparency and comprehensibility in these systems' decision-making processes
has intensified. This survey delves into the realm of interpretable cross-modal
reasoning (I-CMR), where the objective is not only to achieve high predictive
performance but also to provide human-understandable explanations for the
results. This survey presents a comprehensive overview of the typical methods
with a three-level taxonomy for I-CMR. It also reviews the existing CMR
datasets annotated with explanations, summarizes the challenges for I-CMR, and
discusses potential future directions. In conclusion, this survey aims to
catalyze the progress of this emerging research area by providing researchers
with a panoramic and comprehensive perspective, illuminating the state of the
art and discerning the opportunities.
Boundary Dynamics of a Transformative Learning Network : Improving Connection at the Interface of Science and Society
Transformative learning networks are a specific type of loose network with geographically distributed members and member organizations. They hold particular promise for transformation when both top-down and bottom-up processes have failed to support desired systems-level change. The aim of this dissertation is to build knowledge about the social-interactional processes, roles, and practices that build the transformative capacity of a network. I apply a poststructural and interpretivist point of view to understand the dynamics of boundaries and boundary work in the National Alliance for Broader Impacts.
The meso-theory herein claims that two types of boundary work - building boundaries and navigating across boundaries - operate in productive tension to expand knowledge resources and increase network authority and influence in the system over time. This suggests that network leaders can dynamically manage boundaries, shifting emphasis between strength and fluidity to support multi-sited and multi-scalar change.
The primary claim of the applied research contribution is that a variety of both structures and interdependent roles and practices work in concert to support transformation across sites and scales. To support this claim, I detail common network substructures, across which critical practices occur and develop a typology of network practices in two distinct, but interdependent roles. Those in the sojourner role focus on site-based work to shift everyday norms. They demonstrate keen awareness of how their institutions enable and constrain change efforts and contribute that knowledge to the network. Those in an expert role, design networks to support meaningful member engagement opportunities across sites and at the same time build identity and coherence within the network to enable transformation at multiple scales. The expert and sojourner roles generally correspond with boundary building and boundary navigation respectively.
In addition to the focus on boundary dynamics in networks, this study also examines “Broader Impacts” as a path for connecting science and society in a time when the realms of science and other sectors of society need to come together to address increasingly complex social, educational, and environmental challenges. The final contribution describes a manifestation of one of many possible transformative pathways that emerged from and evolves within the network. The concept of helping scientists develop their “impact identity” integrates scholarship in a scientific discipline with societal needs, personal preferences, capacities and skills, and one’s institutional context. I understand identity, or a scientist’s self-concept, as a productive driver that can improve outcomes for scientists and for society by bridging the gap between them through public engagement activities.
This body of work ties together the theory of morphogenesis from critical realism, boundary concepts from across disciplines, and the landscapes of practice conceptual framework. The aim is to expand understanding about the design and potential of learning networks, which disrupt the status quo to guide change in social-ecological and social-educational systems. The new theory and insights about structures, roles, and practices can inform network and transformation scholars across disciplines. Network leaders, designers, and evaluators can also apply this work to their practice.
Toward a formal theory for computing machines made out of whatever physics offers: extended version
Approaching limitations of digital computing technologies have spurred
research in neuromorphic and other unconventional approaches to computing. Here
we argue that if we want to systematically engineer computing systems that are
based on unconventional physical effects, we need guidance from a formal theory
that is different from the symbolic-algorithmic theory of today's computer
science textbooks. We propose a general strategy for developing such a theory,
and within that general view, a specific approach that we call "fluent
computing". In contrast to Turing, who modeled computing processes from a
top-down perspective as symbolic reasoning, we adopt the scientific paradigm of
physics and model physical computing systems bottom-up by formalizing what can
ultimately be measured in any physical substrate. This leads to an
understanding of computing as the structuring of processes, while classical
models of computing systems describe the processing of structures. Comment: 76 pages. This is an extended version of a perspective article with
the same title that will appear in Nature Communications soon after this
manuscript goes public on arXiv.
Unified Behavior Framework for Reactive Robot Control in Real-Time Systems
Endeavors in mobile robotics focus on developing autonomous vehicles that operate in dynamic and uncertain environments. By reducing the need for human-in-the-loop control, unmanned vehicles are utilized to achieve tasks considered dull or dangerous by humans. Because unexpected latency can adversely affect the quality of an autonomous system's operations, which in turn can affect lives and property in the real world, their ability to detect and handle external events is paramount to providing safe and dependable operation. Behavior-based systems form the basis of autonomous control for many robots. This thesis presents the unified behavior framework, a novel approach which incorporates the critical ideas and concepts of existing reactive controllers in an effort to simplify development without locking the system developer into any single behavior system. The modular design of the framework is based on modern software engineering principles and only specifies a functional interface for components, leaving the implementation details to the developers. In addition to its use of industry-standard techniques in the design of reactive controllers, the unified behavior framework guarantees the responsiveness of routines that are critical to the vehicle's safe operation by allowing individual behaviors to be scheduled by a real-time process controller. The experiments in this thesis demonstrate the ability of the framework to: 1) interchange behavioral components during execution to generate various global behavior attributes; 2) apply genetic programming techniques to automate the discovery of effective structures for a domain that are up to 122 percent better than those crafted by an expert; and 3) leverage real-time scheduling technologies to guarantee the responsiveness of time-critical routines regardless of the system's computational load.
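The framework's key design choice, specifying only a functional interface so behavioral components can be interchanged during execution, can be sketched roughly as follows. All class and method names here are illustrative assumptions, not the thesis's actual API.

```python
# Minimal sketch of an interface-only behavior framework in which a
# controller swaps behavior components at runtime. Names (ReactiveBehavior,
# gen_response, Controller) are invented for illustration.

from abc import ABC, abstractmethod

class ReactiveBehavior(ABC):
    """Functional interface only; implementation is left to developers."""
    @abstractmethod
    def gen_response(self, percept):
        """Map a percept dictionary to a motor response."""

class Wander(ReactiveBehavior):
    def gen_response(self, percept):
        return "forward"               # ignores obstacles entirely

class AvoidObstacle(ReactiveBehavior):
    def gen_response(self, percept):
        return "turn_left" if percept.get("obstacle") else "forward"

class Controller:
    """Holds the current behavior; any ReactiveBehavior is acceptable."""
    def __init__(self, behavior):
        self.behavior = behavior
    def swap(self, behavior):          # interchange components mid-execution
        self.behavior = behavior
    def step(self, percept):
        return self.behavior.gen_response(percept)

ctrl = Controller(Wander())
print(ctrl.step({"obstacle": True}))   # -> forward
ctrl.swap(AvoidObstacle())             # global behavior changes at runtime
print(ctrl.step({"obstacle": True}))   # -> turn_left
```

Because the controller depends only on the interface, a real-time scheduler could invoke `step` for safety-critical behaviors on a fixed period regardless of which implementation is installed, which is the responsiveness guarantee the thesis targets.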
Construction of a support tool for the design of the activity structures based computer system architectures
This thesis was submitted for the degree of Doctor of Philosophy and was awarded by Brunel University. This thesis is a rapprochement of diverse design concepts, brought to bear upon the computer system
engineering problem of identification and control of highly constrained multiprocessing (HCM)
computer machines. It contributes to the area of meta/general systems methodology, and brings
a new insight into the design formalisms, and results afforded by bringing together various design
concepts that can be used for the construction of highly constrained computer system architectures.
A unique point of view is taken by assuming the process of identification and control of HCM
computer systems to be the process generated by the Activity Structures Methodology (ASM).
The research in ASM emerged from neuroscience research, aiming at providing the
techniques for combining the diverse knowledge sources that capture the 'deep knowledge' of this
application field in an effective formal and computer representable form. To apply the ASM design
guidelines in the realm of the distributed computer system design, we provide new design definitions
for the identification and control of such machines in terms of realisations. These realisation definitions
characterise the various classes of the identification and control problem. The classes covered
consist of:
1. the identification of the designer activities,
2. the identification and control of the machine's distributed structures of behaviour,
3. the identification and control of the conversational environment activities (i.e. the randomised/adaptive activities and interactions of both the user and the machine environments),
4. the identification and control of the substrata needed for the realisation of the machine, and
5. the identification of the admissible design data, both user-oriented and machine-oriented, that can force the conversational environment to act in a self-regulating manner.
All extant results are considered in this context, allowing the development of both necessary
conditions for machine identification in terms of their distributed behaviours as well as the substrata
structures of the unknown machine and sufficient conditions in terms of experiments on the unknown
machine to achieve the self-regulation behaviour.
We provide a detailed description of the design and implementation of the support software tool
which can be used for aiding the process of constructing effective, HCM computer systems, based
on various classes of identification and control. The design data of a highly constrained system, the
NUKE, are used to verify the tool logic as well as the various identification and control procedures.
Possible extensions as well as future work implied by the results are considered.
Artificial Intelligence Techniques for Automatic Reformulation and Solution of Structured Mathematical Models
Complex, hierarchical, multi-scale industrial and natural systems generate increasingly large mathematical models.
Practitioners are usually able to formulate such models in their "natural" form; however, solving them often
requires finding an appropriate reformulation to reveal structures in the model which make it possible to
apply efficient, specialized approaches. The search for the "best" formulation of a given problem, the one which
allows the application of the solution algorithm that best exploits the available computational resources, is currently
a painstaking process which requires considerable work by highly skilled personnel. Experts in solution algorithms are
required to figure out which (formulation, algorithm) pair is best used, considering issues like the appropriate
selection of the several obscure algorithmic parameters that each solution method has. This process is only going to
get more complex, as current trends in computer technology dictate the necessity to develop complex parallel approaches
capable of harnessing the power of thousands of processing units, thereby adding another layer of complexity in the form
of the choice of the appropriate (parallel) architecture. All this renders the use of mathematical models exceedingly
costly and difficult for many potentially fruitful applications. The I-DARE environment, proposed in this Thesis, aims
at devising a software system for automatizing the search for the best combination of (re)formulation, solution
algorithm and its parameters (including the computational architecture), until now a firm domain of human intervention,
to help practitioners bridge the gap between mathematical models cast in their natural form and existing solver
systems. I-DARE deals with deep and challenging issues, both from the theoretical and from an implementative viewpoint:
1) the development of a language that can be effectively used to formulate large-scale structured mathematical
models and the reformulation rules that allow one to transform a formulation into a different one; 2) a core subsystem
capable of automatically reformulating the models and searching the space of (formulation, algorithm,
configuration) combinations for "the best" formulation of a given problem; 3) the design of a general interface for
numerical solvers that is capable of accommodating and exploiting structure information.
To achieve these goals I-DARE will propose a sound and articulated integration of different programming paradigms and
techniques like classic Object-Oriented programming and Artificial Intelligence (Declarative Programming, Frame-Logic,
Higher-Order Logic, Machine Learning). By tackling these challenges, I-DARE may have profound, lasting and disruptive
effects on many facets of the development and deployment of mathematical models and the corresponding solution
algorithms.
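The search over (formulation, algorithm, configuration) combinations described above can be sketched as a minimization over candidate triples under a predicted-cost model. Everything here is an invented stand-in: the formulation and algorithm names, the configuration options, and the cost function, which plays the role of the learned performance predictor the thesis envisions.

```python
# Hedged sketch of searching a (formulation, algorithm, configuration)
# space for the cheapest predicted solve. All names and numbers are
# illustrative stand-ins, not I-DARE's actual components.

formulations = ["natural", "block-angular", "network-flow"]
algorithms = {"natural": ["generic-LP"],
              "block-angular": ["dantzig-wolfe", "generic-LP"],
              "network-flow": ["network-simplex"]}
configs = {"generic-LP": [{"presolve": p} for p in (True, False)],
           "dantzig-wolfe": [{"subproblems": k} for k in (4, 16)],
           "network-simplex": [{}]}

def predicted_cost(formulation, algorithm, config):
    # Stand-in for a learned cost model: reformulations that expose more
    # structure are assumed to admit cheaper specialized algorithms.
    base = {"generic-LP": 100, "dantzig-wolfe": 40, "network-simplex": 10}
    return base[algorithm] - 5 * int(config.get("presolve", False))

def best_combination():
    """Exhaustively rank every admissible triple by predicted cost."""
    candidates = ((f, a, c) for f in formulations
                  for a in algorithms[f] for c in configs[a])
    return min(candidates, key=lambda t: predicted_cost(*t))

print(best_combination())  # the network-flow reformulation wins here
```

In a realistic system the candidate space would be generated by reformulation rules rather than enumerated by hand, and the cost model would be fit by Machine Learning, but the selection step reduces to the same ranking shown here.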
Concurrent Probabilistic Simulation of High Temperature Composite Structural Response
A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, 'GENOA', is dedicated to parallel and high-speed analysis to perform probabilistic evaluation of the high-temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives which were achieved in performing the development were: (1) utilization of the power of parallel processing and static/dynamic load-balancing optimization to make the complex simulation of the structure, material and processing of high-temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism, increasing convergence rates through high- and low-level processor assignment; (4) creation of the framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid and distributed workstation types of computers; and (5) market evaluation. The results of the Phase-2 effort provide a good basis for continuation and warrant a Phase-3 government and industry partnership.
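The general pattern behind parallel probabilistic structural evaluation, sampling uncertain inputs and distributing the samples across workers, can be sketched as a toy stress-versus-strength Monte Carlo. The distributions, the failure model, and the worker split below are invented stand-ins and have nothing to do with GENOA's actual micromechanics.

```python
# Toy sketch of parallel probabilistic reliability evaluation: Monte Carlo
# samples of uncertain strength and stress, split across workers. The
# distributions and parameters are illustrative assumptions only.

from concurrent.futures import ThreadPoolExecutor
import random

def failure_count(seed, n):
    """Count samples in which random stress exceeds random strength."""
    rng = random.Random(seed)          # per-worker stream for reproducibility
    failures = 0
    for _ in range(n):
        strength = rng.gauss(100.0, 10.0)   # uncertain material strength
        stress = rng.gauss(70.0, 15.0)      # uncertain applied stress
        failures += stress > strength
    return failures

def failure_probability(total=40_000, workers=4):
    """Split the sample budget evenly across parallel workers."""
    per = total // workers
    with ThreadPoolExecutor(workers) as ex:
        counts = ex.map(failure_count, range(workers), [per] * workers)
    return sum(counts) / total

print(round(failure_probability(), 3))
```

The real system replaces the scalar failure model with structural/material mechanics codes and uses domain decomposition plus load balancing to assign work, but the embarrassingly parallel sampling loop is the same shape.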
VERDICTS: Visual Exploratory Requirements Discovery and Injection for Comprehension and Testing of Software
We introduce a methodology and research tools for visual exploratory software analysis. VERDICTS combines exploratory testing, tracing, visualization, dynamic discovery and injection of requirements specifications into a live quick-feedback cycle, without recompilation or restart of the system under test. This supports discovery and verification of software dynamic behavior, software comprehension, testing, and locating the defect origin. At its core, VERDICTS allows dynamic evolution and testing of hypotheses about requirements and behavior, by using contracts as automated component verifiers.
We introduce Semantic Mutation Testing as an approach to evaluate the concordance of automated verifiers, and the functional specifications they represent, with respect to an existing implementation. Mutation testing has promise, but also has many known issues. In our tests, both black-box and white-box variants of our Semantic Mutation Testing approach performed better than traditional mutation testing as a measure of the quality of automated verifiers.
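The idea of using contracts as automated component verifiers, and checking verifier quality by seeing whether they reject semantically mutated implementations, can be sketched as follows. The contract, implementation, and mutant are invented examples, not taken from VERDICTS.

```python
# Illustrative sketch of a contract as an automated verifier, evaluated by
# whether it catches a semantic mutant. All functions here are invented
# examples in the spirit of the approach, not the tool's actual interface.

def sqrt_contract(x, result):
    """Requirement expressed as a checkable postcondition."""
    return x >= 0 and abs(result * result - x) < 1e-6

def implementation(x):
    """The correct component under test."""
    return x ** 0.5

def mutant(x):
    """Semantic mutation: compiles and runs, but is subtly wrong."""
    return x ** 0.5 + 0.01

def verifier_catches(contract, impl, inputs):
    """A verifier is adequate if it rejects the mutant on some input."""
    return any(not contract(x, impl(x)) for x in inputs)

inputs = [0.0, 1.0, 2.0, 9.0]
print(verifier_catches(sqrt_contract, implementation, inputs))  # False
print(verifier_catches(sqrt_contract, mutant, inputs))          # True
```

A contract that accepted both the implementation and the mutant would be flagged as too weak a verifier, which is the concordance measure Semantic Mutation Testing provides.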