
    Control of Networked Multiagent Systems with Uncertain Graph Topologies

    Multiagent systems consist of agents that locally exchange information through a physical network subject to a graph topology. Current control methods for networked multiagent systems assume knowledge of graph topologies in order to design distributed control laws for achieving desired global system behaviors. However, this assumption may not be valid in situations where graph topologies are subject to uncertainties, either due to changes in the physical network or due to modeling errors, especially for multiagent systems involving a large number of interacting agents. Motivated by this standpoint, this paper studies distributed control of networked multiagent systems with uncertain graph topologies. The proposed framework involves a controller architecture that has the ability to adapt its feedback gains in response to system variations. Specifically, we analytically show that the proposed controller drives the trajectories of a networked multiagent system subject to a graph topology with time-varying uncertainties to a close neighborhood of the trajectories of a given reference model having a desired graph topology. As a special case, we also show that a networked multiagent system subject to a graph topology with constant uncertainties asymptotically converges to the trajectories of a given reference model. Although the main result of this paper is presented in the context of the average consensus problem, the proposed framework can be used for many other problems related to networked multiagent systems with uncertain graph topologies. Comment: 14 pages, 2 figures
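The average-consensus dynamics this abstract builds on can be sketched with a minimal simulation. This is an illustrative sketch only, not the paper's adaptive controller: it assumes a known, fixed undirected path graph and the standard discrete-time iteration x ← x − εLx, under which agent states converge to the average of the initial conditions.

```python
import numpy as np

def path_laplacian(n):
    """Graph Laplacian L = D - A of an n-node undirected path graph."""
    A = np.zeros((n, n))
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

def run_consensus(x0, steps=2000, eps=0.1):
    """Discrete-time average consensus: x <- x - eps * L @ x.

    eps must be below 2 / lambda_max(L) for stability; 0.1 is safe
    for small path graphs.
    """
    L = path_laplacian(len(x0))
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x - eps * (L @ x)
    return x

# Four agents with differing initial states; all converge to the mean (2.0).
x = run_consensus([1.0, 4.0, -2.0, 5.0])
```

Because the all-ones vector is in the left null space of L, the sum of the states is conserved at every step, which is why the consensus value is exactly the initial average.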

    CyberCraft: Protecting Electronic Systems with Lightweight Agents

    The United States military is seeking new and innovative methods for securing and maintaining its computing and network resources locally and world-wide. This document presents a work-in-progress research thrust toward building a system capable of meeting many of the US military's network security and sustainment requirements. The system is based on a Distributed Multi-Agent System (DMAS) that is secure, small, and scalable to the large networks found in the military. It relies on a staged agent architecture capable of dynamic configuration to support changing mission environments. These agents are combined into Hierarchical Peer-to-Peer (HP2P) networks to provide scalable solutions. They employ Public Key Infrastructure (PKI) communications (with digital signatures) and support trust chain management concepts. This document presents the motivation and current challenges in choosing a network communications architecture capable of supporting one million or more agents in a DMAS.

    Timing Mark Detection on Nuclear Detonation Video

    During the 1950s and 1960s the United States conducted and filmed over 200 atmospheric nuclear tests, establishing the foundations of atmospheric nuclear detonation behavior. Each explosion was documented with about 20 videos from three or four points of view. Synthesizing the videos into a 3D video will improve yield estimates and reduce error factors. The videos were captured at a nominal 2500 frames per second, but range from 2300-3100 frames per second during operation. In order to combine them into one 3D video, individual video frames need to be correlated in time with each other. When the videos were captured, a timing system shined light into each video every 5 milliseconds, creating a small circle exposed in the frame. This paper investigates several methods of extracting the timing from images in cases where the timing marks are occluded or washed out, as well as when the films are exposed as expected. Results show an improvement over past techniques. For normal, occluded, and washed-out videos, timing is detected with 99.3%, 77.3%, and 88.6% probability, with 2.6%, 11.3%, and 5.9% false alarm rates, respectively.
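The core detection step described above (finding a small bright exposed circle in a frame) can be sketched as a threshold-and-centroid pass. This is an illustrative sketch, not the paper's detector, and the frame, mark position, radius, and threshold below are synthetic assumptions.

```python
import numpy as np

def make_frame(h=64, w=64, cy=20, cx=45, r=4):
    """Synthetic dark frame with one bright disc standing in for a timing mark."""
    yy, xx = np.mgrid[0:h, 0:w]
    frame = np.full((h, w), 10.0)                      # dim film background
    frame[(yy - cy) ** 2 + (xx - cx) ** 2 <= r * r] = 250.0
    return frame

def detect_mark(frame, thresh=128.0):
    """Return the (row, col) centroid of pixels above thresh, or None."""
    ys, xs = np.nonzero(frame > thresh)
    if ys.size == 0:
        return None                                    # no mark found (e.g. occluded)
    return ys.mean(), xs.mean()

cy, cx = detect_mark(make_frame())
```

A fixed threshold is exactly what fails on the occluded and washed-out films the paper targets, which is why returning None (mark absent) has to be a first-class outcome of the detector.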

    Machine Learning Nuclear Detonation Features

    Nuclear explosion yield estimation equations based on a 3D model of the explosion volume will have a lower uncertainty than radius-based estimation. Accurately collecting data for a volume model of atmospheric explosions requires building a 3D representation from 2D images. The majority of 3D reconstruction algorithms use the SIFT (scale-invariant feature transform) feature detection algorithm, which works best on feature-rich objects with continuous angular collections. These assumptions do not hold for the archive of nuclear explosions, which has only 3 points of view. This paper reduces 300 dimensions derived from an image, based on Fourier analysis and five edge detection algorithms, to a manageable number in order to detect hotspots that may be used to correlate videos of different viewpoints for 3D reconstruction. Furthermore, experiments test whether histogram equalization improves detection of these features using four kernel sizes passed over these features. Dimension reduction using principal components analysis (PCA), forward subset selection, ReliefF, and FCBF (Fast Correlation-Based Filter) is combined with a Mahalanobis distance classifier to find the best combination of dimensions, kernel size, and filtering to detect the hotspots. Results indicate that hotspots can be detected with hit rates of 90% and false alarms < 1%.
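One of the pipelines named above, PCA followed by a Mahalanobis-distance classifier, can be sketched end to end in a few lines. This is an illustrative sketch under synthetic assumptions: the data are two well-separated Gaussian clusters, not the paper's Fourier/edge-detection features, and the component count k=3 is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, (200, 30))   # class 0: 30-dim synthetic features
X1 = rng.normal(4.0, 1.0, (200, 30))   # class 1: well separated from class 0
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

# PCA: project centered data onto the top-k right singular vectors.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
k = 3
Z = (X - mu) @ Vt[:k].T

def mahalanobis_classify(Z, y, z):
    """Assign z to the class with the smaller Mahalanobis distance."""
    dists = []
    for c in (0, 1):
        Zc = Z[y == c]
        m = Zc.mean(axis=0)
        S_inv = np.linalg.inv(np.cov(Zc, rowvar=False))
        d = z - m
        dists.append(float(d @ S_inv @ d))
    return int(np.argmin(dists))

preds = np.array([mahalanobis_classify(Z, y, z) for z in Z])
accuracy = (preds == y).mean()
```

Unlike Euclidean distance, the Mahalanobis metric scales each principal component by the class's own covariance, so elongated clusters are handled correctly after projection.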

    Activation of the Listeria monocytogenes Virulence Program by a Reducing Environment.

    Upon entry into the host cell cytosol, the facultative intracellular pathogen Listeria monocytogenes coordinates the expression of numerous essential virulence factors by allosteric binding of glutathione (GSH) to the Crp-Fnr family transcriptional regulator PrfA. Here, we report that robust virulence gene expression can be recapitulated by growing bacteria in a synthetic medium containing GSH or other chemical reducing agents. Bacteria grown under these conditions were 45-fold more virulent in an acute murine infection model and conferred greater immunity to a subsequent lethal challenge than bacteria grown in conventional media. During cultivation in vitro, PrfA activation was completely dependent on the intracellular levels of GSH, as a glutathione synthase mutant (ΔgshF) was activated by exogenous GSH but not by reducing agents. PrfA activation was repressed in a synthetic medium supplemented with oligopeptides, but the repression was relieved by stimulation of the stringent response. These data suggest that cytosolic L. monocytogenes interprets a combination of metabolic and redox cues as a signal to initiate robust virulence gene expression in vivo. IMPORTANCE: Intracellular pathogens are responsible for much of the worldwide morbidity and mortality from infectious diseases. These pathogens have evolved various strategies to proliferate within individual cells of the host and avoid the host immune response. Through cellular invasion or the use of specialized secretion machinery, all intracellular pathogens must access the host cell cytosol to establish their replicative niches. Determining how these pathogens sense and respond to the intracellular compartment to establish a successful infection is critical to our basic understanding of the pathogenesis of each organism and for the rational design of therapeutic interventions. Listeria monocytogenes is a model intracellular pathogen with robust in vitro and in vivo infection models. Studies of the host-sensing and downstream signaling mechanisms evolved by L. monocytogenes often describe themes of pathogenesis that are broadly applicable to less tractable pathogens. Here, we describe how bacteria use external redox states as a cue to activate virulence.

    Structured P2P Technologies for Distributed Command and Control

    The utility of Peer-to-Peer (P2P) systems extends far beyond traditional file sharing. This paper provides an overview of how P2P systems are capable of providing robust command and control for Distributed Multi-Agent Systems (DMASs). Specifically, this article presents the evolution of P2P architectures to date by discussing the supporting technologies and applicability of each generation of P2P systems. It provides a detailed survey of fundamental design approaches found in modern large-scale P2P systems, highlighting design considerations for building and deploying scalable P2P applications. The survey includes unstructured P2P systems, content retrieval systems, communications-structured P2P systems, flat structured P2P systems, and finally Hierarchical Peer-to-Peer (HP2P) overlays. It concludes with a presentation of design tradeoffs and opportunities for future research into P2P overlay systems.

    RC-Chord: Resource Clustering in a Large-Scale Hierarchical Peer-to-Peer System

    Conducting data fusion and Command and Control (C2) in large-scale systems requires more than the presently available Peer-to-Peer (P2P) technologies provide. Resource Clustered Chord (RC-Chord) is an extension to the Chord protocol that incorporates elements of a hierarchical peer-to-peer architecture to facilitate coalition formation algorithms in large-scale systems. Each cluster in this hierarchy represents a particular resource available for allocation, and RC-Chord provides the capabilities to locate agents possessing a particular resource. This approach improves upon other strategies by including support for abundant resources, i.e., those resources that most or all agents in the system possess. This scenario arises in large-scale coalition formation problems and applies directly to the United States Air Force's CyberCraft project. Simulations demonstrate that RC-Chord scales to systems of one million or more agents and can be adapted to serve as a deployment environment for CyberCraft.
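The flat Chord lookup that RC-Chord extends can be sketched briefly: node names and keys are hashed onto a circular identifier space, and each key is owned by its successor node. This is a minimal illustrative sketch, not RC-Chord itself; the 16-bit ring, node names, and resource key below are assumptions, and finger tables, clustering, and the hierarchy are omitted.

```python
import hashlib

ID_BITS = 16            # small ring for illustration; Chord uses 160 bits
RING = 2 ** ID_BITS

def chord_id(name):
    """Hash a node or key name onto the identifier circle."""
    digest = hashlib.sha1(name.encode()).digest()
    return int.from_bytes(digest[:2], "big") % RING

def successor(node_ids, key_id):
    """Owner of key_id: the first node clockwise from it on the ring."""
    ids = sorted(node_ids)
    for nid in ids:
        if nid >= key_id:
            return nid
    return ids[0]       # wrap past the top of the ring back to the start

nodes = [chord_id(f"node-{i}") for i in range(8)]
owner = successor(nodes, chord_id("cpu-resource"))
```

In RC-Chord's setting, mapping a whole resource type to one successor is exactly what breaks down for abundant resources, which motivates the per-resource clusters the abstract describes.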

    Large-scale Cooperative Task Distribution on Peer-to-Peer Networks

    Large-scale systems are part of a growing trend in distributed computing, and coordinating control of them is an increasing challenge. This paper presents a cooperative agent system that scales to one million or more nodes, in which agents form coalitions to complete global task objectives. This approach uses the large-scale Command and Control (C2) capabilities of the Resource Clustered Chord (RC-Chord) Hierarchical Peer-to-Peer (HP2P) design. Tasks are submitted that require access to processing, data, or hardware resources, and a distributed agent search is performed to recruit agents to satisfy the task. This approach differs from others by incorporating design elements that accommodate large-scale systems into the resource location algorithm. PeerSim simulations demonstrate that the distributed coalition formation algorithm is as effective as an omnipotent central algorithm in a one-million-agent system.
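The recruitment step described above (gathering agents until a task's resource demand is satisfied) can be sketched as a greedy selection. This is an illustrative centralized sketch, not the paper's distributed algorithm, and the agent names and capacity numbers are made up for the example.

```python
def form_coalition(agents, demand):
    """Greedily recruit highest-capacity agents until demand is covered.

    agents: dict mapping agent name -> available units of the needed resource.
    Returns (coalition member names, total capacity), or (None, 0) if the
    system cannot satisfy the task at all.
    """
    coalition, total = [], 0
    for name, cap in sorted(agents.items(), key=lambda kv: -kv[1]):
        if total >= demand:
            break
        coalition.append(name)
        total += cap
    if total < demand:
        return None, 0
    return coalition, total

agents = {"a1": 4, "a2": 7, "a3": 2, "a4": 5}
coalition, total = form_coalition(agents, demand=10)
```

A central greedy pass like this is the "omnipotent" baseline the abstract compares against; the distributed version must reach a comparable coalition using only the local views provided by the RC-Chord overlay.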

    Statin Discontinuation among Nursing Home Residents with Advanced Dementia

    Background: Statin use in elderly individuals with life-limiting illness such as advanced dementia is controversial. Objective: To describe factors associated with statin discontinuation and to estimate the impact of discontinuation on 28-day hospitalizations in nursing home (NH) residents with advanced dementia. Methods: Retrospective cohort study of NH residents ≥ 65 years with recent progression to advanced dementia in 5 large U.S. states, drawn from the 2007-2008 Minimum Data Set 2.0. We identified residents using statins. Clinical characteristics and 28-day hospitalization risk were compared for residents discontinuing and continuing statins. Multivariable Cox proportional hazards models identified factors associated with time to statin discontinuation and time to hospitalization. A sensitivity analysis using a self-controlled case series (SCCS) examined the role of confounding by indication in risk estimation from the cohort approach. Results: Of 10,212 residents with decline to advanced dementia, 16.6% were prescribed statins (n=1,699). Statin users had a mean age of 83.1 years, 68.9% were female, and mean medication burden was 10.3 medications (SD 4.8, range 1-31). Over one-third (n=632) discontinued during follow-up. Median time to discontinuation was 36 days after decline to advanced dementia (IQR: 12 days, 110 days). After adjustment, factors independently associated with an increased hazard of discontinuation included residence in a NH in Florida relative to California, hospitalization in the 30 days prior to decline to advanced dementia, greater medication burden, and having cancer. The 28-day hospitalization risk was higher for residents discontinuing statins than for those continuing (adjusted hazard ratio = 1.78, CI 1.61-2.58). The SCCS estimate for 28-day hospitalization risk following statin discontinuation, compared to a 28-day pre-discontinuation control period, was lower than the cohort estimate (IRR = 0.79, CI 0.76-0.83).
    Conclusion: A significant proportion of nursing home residents with dementia who use statins discontinue use when they progress to advanced-stage disease. Hospitalization outcomes following discontinuation differ depending on the method of estimation.
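The self-controlled case series estimate above compares each resident's post-discontinuation hospitalization rate with the same resident's rate in a pre-discontinuation control period. The incidence rate ratio at the heart of that comparison is simple to compute; the event counts and person-time below are made-up numbers for illustration, not the study's data.

```python
def incidence_rate_ratio(events_post, persontime_post,
                         events_pre, persontime_pre):
    """IRR = (post-period event rate) / (pre-period event rate)."""
    rate_post = events_post / persontime_post
    rate_pre = events_pre / persontime_pre
    return rate_post / rate_pre

# Hypothetical counts: 40 hospitalizations over 1000 person-days after
# discontinuation vs. 50 over 1000 person-days in the control window.
irr = incidence_rate_ratio(40, 1000, 50, 1000)   # -> 0.8
```

Because each person serves as their own control, time-invariant confounders (such as the indication for the statin itself) cancel out of the ratio, which is why the SCCS and cohort estimates can diverge as they do here.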