Percentile Queries in Multi-Dimensional Markov Decision Processes
Markov decision processes (MDPs) with multi-dimensional weights are useful to analyze systems with multiple objectives that may be conflicting and require the analysis of trade-offs. We study the complexity of percentile queries in such MDPs and give algorithms to synthesize strategies that enforce such constraints. Given a multi-dimensional weighted MDP and a quantitative payoff function f, thresholds v_i (one per dimension), and probability thresholds α_i, we show how to compute a single strategy to enforce that for all dimensions i, the probability of outcomes ρ satisfying f_i(ρ) ≥ v_i is at least α_i. We consider classical quantitative payoffs from the literature (sup, inf, lim sup, lim inf, mean-payoff, truncated sum, discounted sum). Our work extends to the quantitative case the multi-objective model checking problem studied by Etessami et al. in unweighted MDPs. Comment: Extended version of CAV 2015 paper.
A final report of research on stochastic and adaptive systems
Final report. March 1982. Bibliography: p. 26-31. Air Force Office of Scientific Research Grant AFOSR-77-3281. By Michael Athans, Sanjoy K. Mitter, Lena Valavani.
Scheduling for today's computer systems: bridging theory and practice
Scheduling is a fundamental technique for improving performance in computer systems. From web servers to routers to operating systems, how the bottleneck device is scheduled has an enormous impact on the performance of the system as a whole. Given the immense literature studying scheduling, it is easy to think that we already understand enough about scheduling. But modern computer system designs have highlighted a number of disconnects between traditional analytic results and the needs of system designers. In particular, the idealized policies, metrics, and models used by analytic researchers do not match the policies, metrics, and scenarios that appear in real systems.
The goal of this thesis is to take a step towards modernizing the theory of scheduling in order to provide results that apply to today's computer systems, and thus ease the burden on system designers. To accomplish this goal, we provide new results that help to bridge each of the disconnects mentioned above. We move beyond the study of idealized policies by introducing a new analytic framework where the focus is on scheduling heuristics and techniques rather than individual policies. By moving beyond the study of individual policies, our results apply to the complex hybrid policies that are often used in practice. For example, our results enable designers to understand how policies that favor small job sizes are affected by the fact that real systems only have estimates of job sizes. In addition, we move beyond the study of mean response time and provide results characterizing the distribution of response time and the fairness of scheduling policies. These results allow us to understand how scheduling affects QoS guarantees and whether favoring small job sizes results in large job sizes being treated unfairly. Finally, we move beyond the simplified models traditionally used in scheduling research and provide results characterizing the effectiveness of scheduling in multiserver systems and when users are interactive. These results allow us to answer questions about how to design multiserver systems and how to choose a workload generator when evaluating new scheduling designs.
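The benefit of favoring small job sizes can be seen in a minimal single-server simulation comparing FCFS with preemptive SRPT (shortest remaining processing time). This is a generic illustration with a hypothetical synthetic workload, not the thesis's analytic framework; SRPT is sample-path optimal for mean response time, so it never does worse than FCFS on the same arrival sequence.

```python
import random

random.seed(1)

def simulate(arrivals, sizes, policy):
    """Single-server queue. 'srpt' preempts for the smallest remaining size;
    'fcfs' serves in arrival order. Returns the mean response time."""
    n, t, i = len(arrivals), 0.0, 0
    active, resp = [], []            # active jobs: [remaining_size, arrival_time]
    while len(resp) < n:
        if not active:               # server idle: jump to the next arrival
            t = arrivals[i]
            active.append([sizes[i], arrivals[i]])
            i += 1
            continue
        if policy == "srpt":
            job = min(active, key=lambda j: j[0])   # smallest remaining size
        else:
            job = min(active, key=lambda j: j[1])   # earliest arrival
        next_arrival = arrivals[i] if i < n else float("inf")
        if next_arrival < t + job[0]:               # an arrival interrupts service
            job[0] -= next_arrival - t
            t = next_arrival
            active.append([sizes[i], arrivals[i]])
            i += 1
        else:                                       # current job completes
            t += job[0]
            resp.append(t - job[1])
            active.remove(job)
    return sum(resp) / n

# Hypothetical workload: Poisson arrivals at load ~0.8, exponential job sizes.
arrivals, t = [], 0.0
for _ in range(2000):
    t += random.expovariate(0.8)
    arrivals.append(t)
sizes = [random.expovariate(1.0) for _ in arrivals]
fcfs = simulate(arrivals, sizes, "fcfs")
srpt = simulate(arrivals, sizes, "srpt")
```

Running both policies on the identical arrival sequence isolates the effect of the scheduling decision itself, which is exactly the kind of comparison the disconnects above concern.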
Advances in Trans-dimensional Geophysical Inference
This research presents a series of novel Bayesian trans-dimensional methods for geophysical inversion. A first example illustrates how Bayesian prior information obtained from theory and numerical experiments can be used to better inform a difficult multi-modal inversion of dispersion information from empirical Green's functions obtained from ambient noise cross-correlation. This approach is an extension of existing partition modelling schemes.
An entirely new class of trans-dimensional algorithm, called the trans-dimensional tree method, is introduced. This new method is shown to be more efficient at coupling to a forward model, more efficient at convergence, and more adaptable to different dimensions and geometries than existing approaches. The efficiency and flexibility of the trans-dimensional tree method are demonstrated in two different examples: (1) airborne electromagnetic tomography (AEM) in a 2D transect inversion, and (2) a fully non-linear inversion of ambient noise tomography. In the latter example, the resolution at depth has been significantly improved by inverting a contiguous band of frequencies jointly rather than as independent phase velocity maps, allowing new insights into crustal architecture beneath Iceland.
In a first test case for even larger scale problems, an application of the trans-dimensional tree approach to a large global data set is presented. A global database of nearly 5 million multi-mode path-average Rayleigh wave phase velocity observations has been used to construct global phase velocity maps. Results are comparable to existing published phase velocity maps; however, as the trans-dimensional approach adapts the resolution to the data, rather than imposing damping or smoothing constraints to stabilize the inversion, the recovered anomaly magnitudes are generally higher, with low uncertainties. While further investigation is needed, this early test case shows that trans-dimensional sampling can be applied to global scale seismology problems and that previous analyses may, in some locales, underestimate the heterogeneity of the Earth.
Finally, in a further advancement of partition modelling with variable order polynomials, a new method has been developed called trans-dimensional spectral elements. Previous applications involving variable order polynomials have used polynomials that are both difficult to work with in a Bayesian framework and unstable at higher orders. By using the orthogonal polynomials typically employed in modern full-waveform solvers, the useful properties of this type of polynomial and its application in trans-dimensional inversion are demonstrated. Additionally, these polynomials can be used directly in complex differential solvers, and an example of this for 1D inversion of surface wave dispersion curves is given.
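The core mechanism behind partition modelling, sampling over models whose number of parameters is itself unknown, can be sketched with a 1D birth/death sampler fitting a piecewise-constant function. This is a deliberately simplified illustration on hypothetical synthetic data: a full reversible-jump treatment (as in the thesis) also requires prior, proposal-ratio, and dimension-matching terms in the acceptance probability, which are omitted here.

```python
import math
import random

random.seed(0)

# Hypothetical synthetic data: a step function plus small noise.
xs = [i / 100.0 for i in range(100)]
ys = [(0.0 if x < 0.5 else 5.0) + random.gauss(0.0, 0.1) for x in xs]
SIGMA = 0.5   # assumed data noise used in the likelihood
K_MAX = 10    # cap on the number of partitions

def predict(model, x):
    """model: sorted list of (position, value); piecewise-constant prediction."""
    val = model[0][1]
    for pos, v in model:
        if pos <= x:
            val = v
    return val

def log_like(model):
    return -0.5 * sum((y - predict(model, x)) ** 2 for x, y in zip(xs, ys)) / SIGMA ** 2

def step(model, ll):
    """One birth/death/perturb move with simplified Metropolis acceptance."""
    move = random.random()
    prop = [list(p) for p in model]
    if move < 0.3 and len(model) < K_MAX:     # birth: add a changepoint
        prop.append([random.random(), random.uniform(min(ys), max(ys))])
    elif move < 0.6 and len(model) > 1:       # death: remove a changepoint
        prop.pop(random.randrange(1, len(prop)))
    else:                                     # perturb one segment value
        prop[random.randrange(len(prop))][1] += random.gauss(0.0, 0.5)
    prop.sort()
    prop = [tuple(p) for p in prop]
    ll_new = log_like(prop)
    if math.log(random.random()) < ll_new - ll:
        return prop, ll_new
    return model, ll

model = [(0.0, sum(ys) / len(ys))]   # start from a single constant segment
ll = log_like(model)
best, best_ll = model, ll
for _ in range(5000):
    model, ll = step(model, ll)
    if ll > best_ll:
        best, best_ll = model, ll
```

With the step in the data at x = 0.5, the sampler quickly accepts a birth near the discontinuity, and the best model found has at least two segments; the trans-dimensional tree and spectral-element methods above generalize this idea to structured, higher-dimensional parameterizations.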
The entangled cyberspace: an integrated approach for predicting cyber-attacks
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.
Significant studies in cyber defence analysis have predominantly revolved around a single linear analysis of information from a single source of evidence (the network). These studies were limited in their ability to understand the dynamics of entanglements related to cyber-incidents. This research integrates evidence beyond the network in an attempt to understand and predict phases of the kill-chain across the information space.
This research provides a multi-dimensional phased analysis of the traditional kill-chain model using structural vector autoregressive models. In the "Entangled Cyberspace Framework", each phase of the kill-chain corresponds to a single dimension of the information space based on time observations of certain events. Events are represented as time signals, where each phase is characterised by multiple time signals representing multiple events in that phase. Multiple time signals are analysed using structural models for multiple time series analysis (Vector Auto-Regressive models). At each phase of the kill-chain, we perform a lagged co-integration analysis of events across the information space. This analysis detects hidden entanglements that characterise events in the kill-chain beyond the network. The prediction accuracy and error measured at each stage of the experiment represent the usefulness of the selected events in characterising the defined stage of the kill-chain.
The entangled cyberspace, in theory, is the fusion of three conceptual foundations: a) A multi-dimensional characterisation of cyberspace, b) A sequential phased model for perpetrating cyber-attacks and c) A structural model for integrating and simultaneously analysing multiple sources of evidence. It starts with the characterisation of the information space into different dimensions of interest. The framework goes further to identify evidence sources across these characterised dimensions and integrates them in the analytical context under consideration (e.g. Malware Injection).
The concrete findings show that our approach and analytical methodology are capable of detecting entanglements when applied to a set of entangled activities across the information space. The findings also prove that activities beyond the network have significant effects on the nature of the unfolding cyber-attack vector. The predictive features of events across the kill-chain were also presented in this research as opinion and emotion drivers on the social dimension, packet data details and social and cultural events on the economic layer. Finally, co-integration detected between events across and within dimensions of the information space proves the existence of both inter-dimensional and intra-dimensional entanglements that affect the nature of events unfolding during the kill-chain (from the adversary's point of view).
The novelty of this research rests in the ability to hop across the information space for detecting evidential clues of activities that are related to cyber-incidents. This research also expands the standard multi-dimensional information space to include SPEC factors as indicators of cyber-incidents. This research improves the current information security management model, specifically in the monitoring, analysis and detection phases. This research provides a methodology that accommodates a robust evidence base for understanding the attack surface. Practically, this research provides a basis for creating applications and tools for protecting critical national infrastructure by integrating data from social platforms, real-world political, cultural and economic events and the cyber-physical
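The framework's basic statistical ingredient, fitting a vector autoregressive model to several co-evolving event signals, can be sketched with a minimal VAR(1) least-squares fit. The two signals below are hypothetical stand-ins for event counts from two dimensions of the information space; the thesis's structural models and co-integration tests add considerably more machinery on top of this.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a 2-dimensional VAR(1) process y_t = A @ y_{t-1} + noise,
# standing in for two event-count time signals (hypothetical data).
A_true = np.array([[0.5, 0.2],
                   [0.0, 0.3]])
T = 2000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(0.0, 0.1, size=2)

# Ordinary least squares fit of the coefficient matrix:
# solve Y_prev @ A.T ~= Y_next for A.
Y_prev, Y_next = y[:-1], y[1:]
A_hat = np.linalg.lstsq(Y_prev, Y_next, rcond=None)[0].T
```

A nonzero off-diagonal entry in the recovered matrix (here A_hat[0, 1]) is the kind of cross-signal dependence that, in the framework, would indicate an entanglement between events on different dimensions.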
Proceedings of the Fifth NASA/NSF/DOD Workshop on Aerospace Computational Control
The Fifth Annual Workshop on Aerospace Computational Control was one in a series of workshops sponsored by NASA, NSF, and the DOD. The purpose of these workshops is to address computational issues in the analysis, design, and testing of flexible multibody control systems for aerospace applications. The intention in holding these workshops is to bring together users, researchers, and developers of computational tools in aerospace systems (spacecraft, space robotics, aerospace transportation vehicles, etc.) for the purpose of exchanging ideas on the state of the art in computational tools and techniques.
Stochastic stability research for complex power systems
"Midterm report ... ." November 1980. Bibliography: p. 302-311. U.S. Dept. of Energy Contract ET-76-A-01-2295. By Tobias A. Trygar.
A Bayesian Approach to Parameter Inference in Queueing Networks
The application of queueing network models to real-world systems often involves the task of estimating the service demand placed by requests at queueing nodes. In this article, we propose a methodology to estimate service demands in closed multiclass queueing networks based on Gibbs sampling. Our methodology requires measurements of the number of jobs at resources and can accept prior probabilities on the demands. Gibbs sampling is challenging to apply to estimation problems for queueing networks since it requires one to efficiently evaluate a likelihood function on the measured data. This likelihood function depends on the equilibrium solution of the network, which is difficult to compute in closed models due to the presence of the normalizing constant of the equilibrium state probabilities. To tackle this obstacle, we define a novel iterative approximation of the normalizing constant and show the improved accuracy of this approach, compared to existing methods, for use in conjunction with Gibbs sampling. We also demonstrate that, as a demand estimation tool, Gibbs sampling outperforms other popular Markov Chain Monte Carlo approximations. Experimental validation based on traces from a cloud application demonstrates the effectiveness of Gibbs sampling for service demand estimation in real-world studies.
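The normalizing constant that makes the likelihood expensive is, for single-class product-form closed networks, classically computed by Buzen's convolution algorithm; the sketch below shows that baseline case (the article's setting is multiclass, where this computation becomes far more costly, motivating the iterative approximation).

```python
def buzen_normalizing_constants(demands, n_jobs):
    """Buzen's convolution algorithm for a closed, single-class,
    product-form queueing network with fixed-rate stations.
    demands[m] = service demand at station m; returns [G(0), ..., G(N)]."""
    g = [1.0] + [0.0] * n_jobs
    for d in demands:
        for n in range(1, n_jobs + 1):
            g[n] += d * g[n - 1]   # convolve station d into the running constants
    return g

# Hypothetical example: two stations with demands 1.0 and 2.0, three jobs.
G = buzen_normalizing_constants([1.0, 2.0], 3)
# Standard performance metrics follow directly from the constants:
X = G[2] / G[3]                     # system throughput with N = 3 jobs
U = [d * X for d in [1.0, 2.0]]     # per-station utilizations d_m * X
```

Because every evaluation of the likelihood inside a Gibbs sweep needs these constants for candidate demand vectors, the cost of this step dominates, which is why a cheaper approximation of G is central to the method.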
Mapping boundaries of generative systems for design synthesis
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Architecture, 2007. Page 123 blank. Includes bibliographical references (p. 121-122). Architects have been experimenting with generative systems for design without a clear reference or theory of what, why or how to deal with such systems. In this thesis I argue for three points. The first is that generative systems in architecture are implemented at a skin-deep level, as they are only used to synthesize form within confined domains. The second is that such systems can only be implemented if a design formalism is defined. The third is that generative systems can be more deeply integrated within a design process if they are coupled with performance-based evaluation methods. These arguments are discussed in four chapters: 1- Introduction: a panoramic view of generative systems in architecture and in computing, mapping their occurrences and implementations. 2- Generative Systems for Design: highlights on integrating generative systems in architectural design processes, and discussions of six generative systems: Algorithmic, Parametrics, L-systems, Cellular Automata, Fractals and Shape Grammars. 3- Provisional Taxonomy: a summary table of system properties and a classification of generative system properties as discussed in the previous chapter. 4- Conclusion: comments and explanations on why such systems are implemented only superficially within design. By Maher El-Khaldi. S.M.
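Of the six generative systems the thesis discusses, L-systems are the simplest to sketch: a string-rewriting grammar whose repeated application generates form. The rule set below is a generic Koch-curve-style example, not one taken from the thesis.

```python
def lsystem(axiom, rules, iterations):
    """Iteratively rewrite each symbol via its production rule;
    symbols without a rule are copied unchanged."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(c, c) for c in s)
    return s

# Koch-curve style grammar: F = draw forward, +/- = turn left/right.
koch = lsystem("F", {"F": "F+F-F-F+F"}, 2)
```

Interpreting the resulting string with a turtle-graphics drawer turns the grammar into geometry, which is exactly the "synthesize form" role the thesis describes, and coupling the interpreter to a performance evaluator is one route to the deeper integration it argues for.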