Automatic Generation of Minimal Cut Sets
A cut set is a collection of component failure modes that could lead to a
system failure. Cut Set Analysis (CSA) is applied to critical systems to
identify and rank system vulnerabilities at design time. Model checking tools
have been used to automate the generation of minimal cut sets but are generally
based on checking reachability of system failure states. This paper describes a
new approach to CSA using a Linear Temporal Logic (LTL) model checker called BT
Analyser that supports the generation of multiple counterexamples. The approach
enables a broader class of system failures to be analysed, by generalising from
failure state formulae to failure behaviours expressed in LTL. The traditional
approach to CSA using model checking requires the model or system failure to be
modified, usually by hand, to eliminate already-discovered cut sets, and the
model checker to be rerun, at each step. By contrast, the new approach works
incrementally and fully automatically, thereby removing the tedious and
error-prone manual process and resulting in significantly reduced computation
time. This in turn enables larger models to be checked. Two different
strategies for using BT Analyser for CSA are presented. There is generally no
single best strategy for model checking: their relative efficiency depends on
the model and property being analysed. Comparative results are given for the
A320 hydraulics case study in the Behavior Tree modelling language. (Comment: In Proceedings ESSS 2015, arXiv:1506.0325)
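The incremental enumeration that a model checker automates can be illustrated, at a much smaller scale, by a brute-force sketch. The 2-out-of-3 pump system and the monotone failure predicate below are illustrative assumptions, not the paper's A320 model or BT Analyser's algorithm:

```python
from itertools import combinations

def minimal_cut_sets(components, system_fails):
    """Enumerate minimal cut sets of a monotone failure predicate.

    A cut set is a set of failed components that causes system failure;
    it is minimal if no proper subset does. Candidates are examined in
    order of increasing size, and supersets of already-found cut sets
    are skipped, so every reported set is minimal.
    """
    found = []
    for size in range(1, len(components) + 1):
        for cand in combinations(components, size):
            s = frozenset(cand)
            if any(mcs <= s for mcs in found):
                continue  # superset of a known cut set: not minimal
            if system_fails(s):
                found.append(s)
    return found

# Hypothetical 2-out-of-3 redundant system: it fails when any two of
# the three pumps fail (names are illustrative, not from the paper).
pumps = ["P1", "P2", "P3"]
fails = lambda failed: len(failed) >= 2
mcs = minimal_cut_sets(pumps, fails)
# Every minimal cut set here is one of the three pump pairs.
```

A model-checking approach replaces the `system_fails` oracle with reachability (or, in the paper's generalisation, an LTL property) over a behavioural model, which is what makes the technique scale beyond toy predicates.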
Trajectory Design of Laser-Powered Multi-Drone Enabled Data Collection System for Smart Cities
This paper considers a multi-drone enabled data collection system for smart cities, where there are two kinds of drones: Low Altitude Platforms (LAPs) and a High Altitude Platform (HAP). In the proposed system, the LAPs perform data collection tasks for smart cities and the solar-powered HAP provides energy to the LAPs using wireless laser beams. We aim to minimize the total laser charging energy of the HAP by jointly optimizing the LAPs' trajectories and the laser charging duration for each LAP, subject to the energy capacity constraints of the LAPs. This problem is formulated as a mixed-integer and non-convex Drones Traveling Problem (DTP), which is a combinatorial optimization problem and NP-hard. We propose an efficient and novel search algorithm, named the Drones Traveling Algorithm (DTA), to obtain a near-optimal solution. Simulation results show that the DTA can handle large-scale DTP instances (i.e., more than 400 data collection points) efficiently. Moreover, the DTA uses only 5 iterations to obtain the near-optimal solution, whereas a standard Genetic Algorithm needs nearly 10,000 iterations and still fails to obtain an acceptable solution.
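The abstract does not detail the DTA itself; as a hedged illustration of the trajectory side of the problem, here is a minimal nearest-neighbour tour heuristic over made-up collection points — a common baseline for such traveling problems, not the paper's algorithm:

```python
import math

def nearest_neighbour_tour(points, start=0):
    """Greedy tour over 2-D data collection points: repeatedly fly to
    the closest unvisited point. A simple baseline for trajectory
    construction (NOT the paper's DTA, which is not detailed here)."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# Hypothetical collection points for one LAP (coordinates are made up).
pts = [(0, 0), (5, 0), (1, 1), (6, 1)]
tour = nearest_neighbour_tour(pts)
# tour == [0, 2, 1, 3]: the drone hops to the nearest point each time.
```

A full DTP solver would additionally track each LAP's energy budget and schedule the HAP's laser charging, which is what makes the joint problem mixed-integer and non-convex.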
Chebyshev polynomial filtered subspace iteration in the Discontinuous Galerkin method for large-scale electronic structure calculations
The Discontinuous Galerkin (DG) electronic structure method employs an
adaptive local basis (ALB) set to solve the Kohn-Sham equations of density
functional theory (DFT) in a discontinuous Galerkin framework. The adaptive
local basis is generated on-the-fly to capture the local material physics, and
can systematically attain chemical accuracy with only a few tens of degrees of
freedom per atom. A central issue for large-scale calculations, however, is the
computation of the electron density (and subsequently, ground state properties)
from the discretized Hamiltonian in an efficient and scalable manner. We show
in this work how Chebyshev polynomial filtered subspace iteration (CheFSI) can
be used to address this issue and push the envelope in large-scale materials
simulations in a discontinuous Galerkin framework. We describe how the subspace
filtering steps can be performed in an efficient and scalable manner using a
two-dimensional parallelization scheme, thanks to the orthogonality of the DG
basis set and block-sparse structure of the DG Hamiltonian matrix. The
on-the-fly nature of the ALBs requires additional care in carrying out the
subspace iterations. We demonstrate the parallel scalability of the DG-CheFSI
approach in calculations of large-scale two-dimensional graphene sheets and
bulk three-dimensional lithium-ion electrolyte systems. Employing 55,296
computational cores, the time per self-consistent field iteration for a sample
of the bulk 3D electrolyte containing 8,586 atoms is 90 seconds, and the time
for a graphene sheet containing 11,520 atoms is 75 seconds. (Comment: Submitted to The Journal of Chemical Physics)
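The key property behind CheFSI can be sketched without any matrix machinery: a Chebyshev polynomial, affinely mapped so the unwanted part of the spectrum lands on [-1, 1], stays bounded there while growing rapidly on the wanted (occupied) eigenvalues, so repeated filtering steers the subspace toward the occupied states. A minimal sketch, in which the spectrum bounds and filter degree are illustrative assumptions rather than values from the paper:

```python
def cheb(m, x):
    """Chebyshev polynomial T_m(x) via the three-term recurrence
    T_{k+1}(x) = 2x T_k(x) - T_{k-1}(x)."""
    t0, t1 = 1.0, x
    for _ in range(m - 1):
        t0, t1 = t1, 2 * x * t1 - t0
    return t1 if m >= 1 else t0

def filter_gain(m, lam, a, b):
    """Gain of the degree-m Chebyshev filter at eigenvalue lam, with
    the unwanted spectral interval [a, b] mapped affinely onto [-1, 1].
    Inside [a, b] the gain is bounded by 1 in magnitude; below a it
    grows rapidly, which is what makes the filtered subspace converge
    to the occupied states."""
    x = (2 * lam - (a + b)) / (b - a)
    return cheb(m, x)

# Illustrative spectrum bounds and degree (made up, not from the paper):
a, b = 1.0, 10.0  # unwanted (unoccupied) part of the spectrum
m = 10            # filter degree
low = abs(filter_gain(m, 0.0, a, b))   # wanted eigenvalue, below a
high = abs(filter_gain(m, 5.0, a, b))  # unwanted eigenvalue, in [a, b]
# low >> high: the filter strongly favours the wanted eigenvector.
```

In the actual method the polynomial is applied to the (block-sparse) DG Hamiltonian acting on a block of vectors, followed by orthonormalization and a Rayleigh-Ritz step; the scalar picture above only shows why the filtering works.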
Nanobodies as tools to understand, diagnose, and treat African trypanosomiasis
African trypanosomes are strictly extracellular protozoan parasites that cause diseases in humans and livestock and significantly affect the economic development of sub-Saharan Africa. Due to an elaborate and efficient (vector)-parasite-host interplay, required to complete their life cycle/transmission, trypanosomes have evolved efficient immune escape mechanisms that manipulate the entire host immune response. So far, not a single field-applicable vaccine exists, and chemotherapy is the only strategy available to treat the disease. Current therapies, however, exhibit high drug toxicity, and increased drug resistance is being reported. In addition, diagnosis is often hampered by the inadequacy of current diagnostic procedures. In the context of tackling the shortcomings of current treatment and diagnostic approaches, nanobodies (Nbs, derived from the heavy chain-only antibodies of camels and llamas) might offer distinct advantages over conventional tools. Indeed, the combination of their small size, high stability, high affinity and specificity for their target, and tailorability represents a unique advantage, which is reflected by their broad use in basic and clinical research to date. In this article, we will (i) review and discuss diagnostic and therapeutic applications of Nbs that are being evaluated in the context of African trypanosomiasis, (ii) summarize new strategies that are being developed to optimize their potency for advancing their use, and (iii) document unexpected properties of Nbs, such as inherent trypanolytic activities, which, besides opening new therapeutic avenues, might offer new insights into hidden biological activities of conventional antibodies.
African Trypanosomes undermine humoral responses and vaccine development : link with inflammatory responses?
African trypanosomosis is a debilitating disease of great medical and socioeconomic importance. It is caused by strictly extracellular protozoan parasites capable of infecting all vertebrate classes, including humans, livestock, and game animals. To survive within their mammalian host, trypanosomes have evolved efficient immune escape mechanisms and manipulate the entire host immune response, including the humoral response. This report provides an overview of how trypanosomes initially trigger and subsequently undermine the development of an effective host antibody response. Indeed, results available to date, obtained in both natural and experimental infection models, show that trypanosomes impair homeostatic B-cell lymphopoiesis, B-cell maturation and survival, and B-cell memory development. Data on B-cell dysfunction in correlation with parasite virulence and trypanosome-mediated inflammation will be discussed, as well as the impact of trypanosomosis on heterologous vaccine efficacy and diagnosis. Therefore, new strategies aiming at enhancing vaccination efficacy could benefit from a combination of (i) early parasite diagnosis, (ii) anti-trypanosome (drug) treatment, and (iii) anti-inflammatory treatment, which collectively might allow B-cell recovery and improve vaccination.
NFV Based Gateways for Virtualized Wireless Sensors Networks: A Case Study
Virtualization enables the sharing of a same wireless sensor network (WSN) by
multiple applications. However, in heterogeneous environments, virtualized
wireless sensor networks (VWSN) raises new challenges such as the need for
on-the-fly, dynamic, elastic and scalable provisioning of gateways. Network
Functions Virtualization (NFV) is an emerging paradigm that can certainly aid
in tackling these new challenges. It leverages standard virtualization
technology to consolidate special-purpose network elements on top of commodity
hardware. This article presents a case study on NFV based gateways for VWSNs.
In the study, a VWSN gateway provider operates and manages an NFV based
infrastructure. We use two different brands of wireless sensors. The NFV
infrastructure makes possible the dynamic, elastic and scalable deployment of
gateway modules in this heterogeneous VWSN environment. A prototype built
with OpenStack as the platform is described.
Combinatorial persistency criteria for multicut and max-cut
In combinatorial optimization, partial variable assignments are called
persistent if they agree with some optimal solution. We propose persistency
criteria for the multicut and max-cut problem as well as fast combinatorial
routines to verify them. The criteria that we derive are based on mappings that
improve feasible multicuts, respectively cuts. Our elementary criteria can be
checked enumeratively. The more advanced ones rely on fast algorithms for upper
and lower bounds for the respective cut problems and max-flow techniques for
auxiliary min-cut problems. Our methods can be used as a preprocessing
technique for reducing problem sizes or for computing partial optimality
guarantees for solutions output by heuristic solvers. We show the efficacy of
our methods on instances of both problems from computer vision, biomedical
image analysis, and statistical physics.
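The definition of persistency that the paper's criteria certify can be checked by brute force on toy instances. The sketch below is exponential-time and purely illustrative — the paper's combinatorial routines exist precisely to avoid this enumeration — and the weighted triangle is a made-up example:

```python
from itertools import product

def cut_value(assign, edges):
    """Total weight of edges whose endpoints lie on different sides."""
    return sum(w for (u, v, w) in edges if assign[u] != assign[v])

def persistent(n, edges, u, v, is_cut):
    """Brute-force check that the partial assignment 'edge (u, v) is
    cut' (is_cut=True) or 'uncut' (is_cut=False) agrees with some
    optimal max-cut of an n-node graph."""
    best = max(cut_value(a, edges) for a in product((0, 1), repeat=n))
    return any(cut_value(a, edges) == best and (a[u] != a[v]) == is_cut
               for a in product((0, 1), repeat=n))

# Hypothetical weighted triangle; the unique optimal cut (value 6, up
# to flipping both sides) separates node 2 from nodes 0 and 1.
edges = [(0, 1, 1), (0, 2, 3), (1, 2, 3)]
# persistent(3, edges, 0, 1, False) -> True  (0 and 1 share a side
#                                             in every optimum)
# persistent(3, edges, 0, 1, True)  -> False
# persistent(3, edges, 0, 2, True)  -> True
```

Fixing such persistent variables in advance shrinks the instance handed to an exact solver, which is the preprocessing use the abstract describes.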
Design of an adiabatic demagnetization refrigerator for studies in astrophysics
An adiabatic demagnetization refrigerator (ADR) was designed for cooling infrared bolometers for studies in astrophysics and aeronomy. The design was tailored to the requirements of a Shuttle sortie experiment. The refrigerator should be capable of maintaining three bolometers at 0.1 K with a 90% duty cycle. The advantages of operating the bolometers at 0.1 K are greater sensitivity, faster response time, and the ability to use larger bolometer elements without compromising the response time. The design presented is the first complete design of an ADR intended for use in space. The most important design specifications are to survive a Shuttle launch, to operate with 1.5 K - 2.0 K space-pumped liquid helium as a heat sink, to have a 90% duty cycle, and to be highly efficient.
Integration of HeartSmart Kids into Clinical Practice: A Quality Improvement Project
Presented to the Faculty of the University of Alaska, Anchorage, in partial fulfillment of requirements for the degree of MASTER OF SCIENCE, FAMILY PRACTICE NURSE.

In 2009, the Centers for Medicare & Medicaid Services (CMS) established "Meaningful Use" regulations through an incentive program, as part of the American Recovery and Reinvestment Act of 2009 (Gance-Cleveland, Gilbert, Gilbert, Dandreaux, & Russell, 2014). Meaningful Use (MU) is tied to reimbursement and focuses on how the Electronic Health Record (EHR) is being used (Centers for Disease Control and Prevention, 2012). The goal of MU is to transform the use of the EHR from a documentation tool to a data reservoir which allows for meaningful reviews and interpretations of the quality of care (Gance-Cleveland et al., 2014).

Contents: Project / Background / Significance / Review of Literature / Problem Overview / Problem Statement / Purpose / Design / Method / Plan Do Study Act (PDSA) / Ethical Considerations / Significance to Nursing / Dissemination / Conclusion