
    Hyper-Spectral Image Analysis with Partially-Latent Regression and Spatial Markov Dependencies

    Hyper-spectral data can be analyzed to recover physical properties at large planetary scales. This involves resolving inverse problems which can be addressed within machine learning, with the advantage that, once a relationship between physical parameters and spectra has been established in a data-driven fashion, the learned relationship can be used to estimate physical parameters for new hyper-spectral observations. Within this framework, we propose a spatially-constrained and partially-latent regression method which maps high-dimensional inputs (hyper-spectral images) onto low-dimensional responses (physical parameters such as the local chemical composition of the soil). The proposed regression model comprises two key features. Firstly, it combines a Gaussian mixture of locally-linear mappings (GLLiM) with a partially-latent response model. While the former makes high-dimensional regression tractable, the latter makes it possible to deal with physical parameters that cannot be observed or, more generally, with data contaminated by experimental artifacts that cannot be explained by noise models. Secondly, spatial constraints are introduced in the model through a Markov random field (MRF) prior which provides a spatial structure to the Gaussian-mixture hidden variables. Experiments conducted on a database of remotely sensed observations collected from Mars by the Mars Express orbiter demonstrate the effectiveness of the proposed model. Comment: 12 pages, 4 figures, 3 tables
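
    A minimal sketch of the inverse-regression idea behind GLLiM is given below: fit a Gaussian mixture on the joint vector of high-dimensional observations and low-dimensional responses, then condition each component on the observation to obtain a mixture of locally-affine predictors. The synthetic data, dimensions, and component count are illustrative assumptions, and the spatially-constrained MRF prior and partially-latent variables of the proposed model are not reproduced here.

# Hedged sketch: inverse regression via a joint Gaussian mixture, in the spirit of
# locally-linear mappings (GLLiM). Not the authors' spatially-constrained,
# partially-latent model; data and dimensions are illustrative assumptions.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic data: a low-dimensional response y drives a high-dimensional "spectrum" x.
N, Dy, Dx, K = 2000, 2, 20, 5
y = rng.uniform(-1.0, 1.0, size=(N, Dy))
A = rng.normal(size=(Dx, Dy))
x = np.tanh(y @ A.T) + 0.05 * rng.normal(size=(N, Dx))  # mildly nonlinear forward map

# Fit a Gaussian mixture on the joint vector [x, y]; conditioning each component on x
# below turns every component into a locally-affine regressor.
Z = np.hstack([x, y])
gmm = GaussianMixture(n_components=K, covariance_type="full", random_state=0).fit(Z)

def predict_y(x_new):
    """E[y | x] under the joint mixture: responsibility-weighted affine predictions."""
    log_resp = np.empty(K)
    cond_means = np.empty((K, Dy))
    for k in range(K):
        mu, S = gmm.means_[k], gmm.covariances_[k]
        mu_x, mu_y = mu[:Dx], mu[Dx:]
        S_xx, S_yx = S[:Dx, :Dx], S[Dx:, :Dx]
        log_resp[k] = np.log(gmm.weights_[k]) + multivariate_normal.logpdf(x_new, mu_x, S_xx)
        # Gaussian conditioning: mu_y + S_yx S_xx^{-1} (x_new - mu_x)
        cond_means[k] = mu_y + S_yx @ np.linalg.solve(S_xx, x_new - mu_x)
    resp = np.exp(log_resp - log_resp.max())
    resp /= resp.sum()
    return resp @ cond_means

print("true y:", y[0], " estimated y:", predict_y(x[0]))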

    A Capability-Centric Approach to Cyber Risk Assessment and Mitigation

    Cyber-enabled systems are increasingly ubiquitous and interconnected, showing up in traditional enterprise settings as well as increasingly diverse contexts, including critical infrastructure, avionics, cars, smartphones, home automation, and medical devices. Meanwhile, the impact of cyber attacks against these systems on our missions, business objectives, and personal lives has never been greater. Despite these stakes, the analysis of cyber risk and of mitigations to that risk tends to be a subjective, labor-intensive, and costly endeavor, with results that can be as suspect as they are perishable. We identified the following gaps in those risk results: concerns about (1) their repeatability and reproducibility, (2) the time required to obtain them, and (3) the completeness of the analysis in terms of attack-surface coverage. In this dissertation, we consider whether it is possible to make progress in addressing these gaps with the introduction of a new artifact called “BluGen.” BluGen is an automated platform for cyber risk assessment that employs a set of new risk analytics together with a highly structured underlying cyber knowledge management repository. To help evaluate the hypotheses tied to the gaps identified, we conducted a study comparing BluGen to a cyber risk assessment methodology called EVRA. EVRA is representative of current practice and has been applied extensively over the past eight years to both fielded systems and systems under design. We used Design Science principles in the construction and investigation of BluGen, during which we considered each of the three gaps. The results of our investigation support the hypotheses tied to the gaps that BluGen is designed to address. Specifically, BluGen helps address the first gap by virtue of its methods and analytics executing as deterministic, automated processes. It helps address the second gap by producing its results at machine speed in no worse than quadratic time, seconds in this case, compared to the 25 hours the EVRA team required to perform the same analysis. BluGen helps address the third gap through its underlying knowledge repository of cyber-related threats, mappings of those threats to cyber assets, and mappings of mitigations to those threats. The results show that manual analysis using EVRA covered about 12% of the attack surface considered by BluGen.
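
    As a hedged illustration of the kind of repository-driven, deterministic analysis described above, the sketch below maps hypothetical threats to assets and mitigations and computes a simple attack-surface coverage figure. The Threat/Asset schema and the coverage metric are assumptions made for the example, not BluGen's actual data model or analytics.

# Hedged sketch of a repository-driven risk pass, loosely inspired by the description
# above. The Threat/Asset/mitigation schema and the coverage metric are illustrative
# assumptions, not BluGen's actual data model or analytics.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Threat:
    name: str

@dataclass
class Asset:
    name: str
    threats: set = field(default_factory=set)        # threats mapped to this asset
    mitigations: dict = field(default_factory=dict)  # threat -> applied mitigation

def coverage(assets):
    """Deterministic attack-surface coverage: the fraction of (asset, threat) pairs
    with at least one mapped mitigation. Runs in O(|assets| * |threats|)."""
    pairs = [(a, t) for a in assets for t in a.threats]
    covered = sum(1 for a, t in pairs if t in a.mitigations)
    return covered / len(pairs) if pairs else 1.0

# Tiny worked example with made-up entries.
phish, usb = Threat("phishing"), Threat("removable-media malware")
ws = Asset("workstation", threats={phish, usb}, mitigations={phish: "mail filtering"})
plc = Asset("plant controller", threats={usb})
print(f"attack-surface coverage: {coverage([ws, plc]):.0%}")  # 1 of 3 pairs -> 33%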

    07101 Abstracts Collection -- Quantitative Aspects of Embedded Systems

    From March 5 to March 9, 2007, the Dagstuhl Seminar 07101 "Quantitative Aspects of Embedded Systems" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

    A Comparison between Fixed-Basis and Variable-Basis Schemes for Function Approximation and Functional Optimization

    Fixed-basis and variable-basis approximation schemes are compared for the problems of function approximation and functional optimization (also known as infinite programming). Classes of problems are investigated for which variable-basis schemes with sigmoidal computational units perform better than fixed-basis ones, in terms of the minimum number of computational units needed to achieve a desired error in function approximation or approximate optimization. Previously known bounds on the accuracy are extended, with better rates, to families o
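
    The fixed-basis versus variable-basis comparison can be illustrated with a small numerical sketch: a fixed polynomial basis whose outer linear coefficients are fitted, against a handful of sigmoidal units whose inner parameters are also optimized. The target function, dimensions, and unit counts below are arbitrary assumptions for the demonstration and do not reproduce the paper's bounds.

# Hedged illustration (not the paper's analysis): a fixed polynomial basis with
# linearly fitted coefficients versus a variable basis of sigmoidal units whose inner
# weights are also optimized. Target, dimensions, and unit counts are arbitrary.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
d = 5
X = rng.uniform(-1.0, 1.0, size=(4000, d))
w = rng.normal(size=d)
y = np.tanh(X @ w)  # a "ridge" target that sigmoidal units represent compactly
X_tr, X_te, y_tr, y_te = X[:3000], X[3000:], y[:3000], y[3000:]

# Fixed basis: degree-2 polynomial features, only the outer linear coefficients fitted.
fixed = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1e-3)).fit(X_tr, y_tr)

# Variable basis: 10 logistic units whose inner weights and biases are also trained.
variable = MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                        solver="lbfgs", max_iter=5000, random_state=0).fit(X_tr, y_tr)

print("fixed-basis test MSE:   ", mean_squared_error(y_te, fixed.predict(X_te)))
print("variable-basis test MSE:", mean_squared_error(y_te, variable.predict(X_te)))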

    A Multiobjective Approach Applied to the Protein Structure Prediction Problem

    Interest in discovering a methodology for solving the Protein Structure Prediction problem extends into many fields of study, including biochemistry, medicine, biology, and numerous engineering and science disciplines. Approaches ranging from experimental techniques, such as x-ray crystallographic studies and solution Nuclear Magnetic Resonance spectroscopy, to mathematical modeling, such as minimum-energy models, are used to attack this problem. Recent Evolutionary Algorithm studies at the Air Force Institute of Technology have investigated the simple Genetic Algorithm (GA), messy GA, fast messy GA (fmGA), and Linkage Learning GA as approaches to protein energy minimization. Prepackaged software such as GENOCOP, GENESIS, and mGA is used to facilitate experimentation with these techniques. In addition, a parallelized version of the fmGA, the so-called parallel fast messy GA, has been found to be good at finding semi-optimal answers in reasonable wall-clock time. The aim of this work is to apply a multiobjective approach to this problem using a modified fast messy GA. By dividing the CHARMm energy model into separate objectives, it should be possible to find structural configurations of a protein that yield lower energy values and, ultimately, more correct conformations.
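
    The multiobjective decomposition can be sketched briefly: split an energy function into separate objectives and retain the Pareto non-dominated conformations. In the sketch below, the toy bonded and non-bonded terms are placeholders rather than the CHARMm force field, and the search loop of the parallel fast messy GA is not reproduced.

# Hedged sketch of the multiobjective idea: split an energy function into separate
# objectives and keep the Pareto non-dominated conformations. The toy bonded and
# non-bonded terms are placeholders, not the CHARMm force field, and no fmGA search
# loop is reproduced here.
import random

def bonded_energy(conf):
    # Placeholder for bonded terms (bonds/angles/dihedrals).
    return sum((a - b) ** 2 for a, b in zip(conf, conf[1:]))

def nonbonded_energy(conf):
    # Placeholder for non-bonded terms (van der Waals/electrostatics).
    return sum(1.0 / (1.0 + abs(a - b))
               for i, a in enumerate(conf) for b in conf[i + 2:])

def dominates(u, v):
    """u Pareto-dominates v if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(u, v)) and any(x < y for x, y in zip(u, v))

random.seed(0)
population = [[random.uniform(-3.14, 3.14) for _ in range(10)] for _ in range(50)]
scores = [(bonded_energy(c), nonbonded_energy(c)) for c in population]
pareto = [c for c, s in zip(population, scores)
          if not any(dominates(t, s) for t in scores if t is not s)]
print(f"{len(pareto)} non-dominated conformations out of {len(population)}")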

    Numerical aerodynamic simulation facility feasibility study

    There were three major issues examined in the feasibility study. First, the ability of the proposed system architecture to support the anticipated workload was evaluated. Second, the throughput of the computational engine (the flow model processor) was studied using real application programs. Third, the availability, reliability, and maintainability of the system were modeled. The evaluations were based on the baseline systems. The results show that the implementation of the Numerical Aerodynamic Simulation Facility, in the form considered, would indeed be a feasible project with an acceptable level of risk. The technology required (both hardware and software) either already exists or, in the case of a few parts, is expected to be announced this year. Facets of the work described include the hardware configuration, software, user language, and fault tolerance.
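
    A hedged sketch of the kind of availability and reliability arithmetic such modeling might involve is shown below; the MTBF/MTTR figures and the subsystem list are made-up placeholders, not values from the feasibility study.

# Hedged sketch of basic availability/reliability arithmetic, not the study's actual
# model. MTBF/MTTR figures and failure rates are made-up placeholders.
import math

def steady_state_availability(mtbf_hours, mttr_hours):
    """Classic single-item availability: MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series_reliability(failure_rates_per_hour, mission_hours):
    """Reliability of components in series with constant failure rates:
    R(t) = exp(-(sum of lambda_i) * t)."""
    return math.exp(-sum(failure_rates_per_hour) * mission_hours)

# Hypothetical subsystems: flow model processor, support processor, mass storage.
print("steady-state availability:",
      steady_state_availability(mtbf_hours=500.0, mttr_hours=2.0))
print("8-hour mission reliability:",
      series_reliability([1 / 500, 1 / 1000, 1 / 2000], mission_hours=8.0))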