Sequential Design for Ranking Response Surfaces
We propose and analyze sequential design methods for the problem of ranking
several response surfaces. Namely, given a collection of response surfaces
over a continuous input space, the aim is to efficiently find the index of the
minimal response across the entire input space. The response surfaces are not
known and have to be noisily sampled one-at-a-time. This setting is motivated
by stochastic control applications and requires joint experimental design both
in space and response-index dimensions. To generate sequential design
heuristics we investigate stepwise uncertainty reduction approaches, as well as
sampling based on posterior classification complexity. We also make connections
between our continuous-input formulation and the discrete framework of pure
regret in multi-armed bandits. To model the response surfaces we utilize
kriging surrogates. Several numerical examples using both synthetic data and an
epidemics control problem are provided to illustrate our approach and the
efficacy of the respective adaptive designs.
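A minimal sketch of the sequential loop may help fix ideas: two Gaussian-process
(kriging) surrogates are refit after every noisy observation, and the next
sample is placed where the posterior gap between the two lowest surrogate means
is small relative to their uncertainty. The toy surfaces, scikit-learn's
GaussianProcessRegressor, and this simple acquisition rule are illustrative
assumptions, not the paper's stepwise-uncertainty-reduction heuristic.

    # Illustrative sketch only: sequential ranking of two toy response surfaces
    # with GP (kriging) surrogates; not the paper's SUR criterion.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    truths = [lambda x: np.sin(3 * x), lambda x: 0.3 + 0.5 * np.cos(2 * x)]

    def sample(i, x):
        # Noisy one-at-a-time observation of surface i at location x.
        return truths[i](x) + 0.1 * rng.standard_normal()

    grid = np.linspace(0.0, 2.0, 101).reshape(-1, 1)
    X = [list(rng.uniform(0.0, 2.0, 5)) for _ in truths]   # initial designs
    Y = [[sample(i, x) for x in X[i]] for i in range(len(truths))]

    for _ in range(40):                                    # sequential budget
        mu, sd = [], []
        for i in range(len(truths)):
            gp = GaussianProcessRegressor(RBF(0.5) + WhiteKernel(0.01),
                                          normalize_y=True)
            gp.fit(np.array(X[i]).reshape(-1, 1), Y[i])
            m, s = gp.predict(grid, return_std=True)
            mu.append(m); sd.append(s)
        mu, sd = np.array(mu), np.array(sd)
        order = np.argsort(mu, axis=0)
        cols = np.arange(len(grid))
        gap = mu[order[1], cols] - mu[order[0], cols]
        score = (sd[order[0], cols] + sd[order[1], cols]) / (gap + 1e-9)
        j = int(np.argmax(score))                          # most ambiguous input
        i = int(order[0, j])                               # apparent minimizer there
        X[i].append(float(grid[j, 0])); Y[i].append(sample(i, grid[j, 0]))

    ranking = np.argmin(mu, axis=0)   # estimated index of the minimal surface

The acquisition rule above is only a stand-in; the paper's heuristics instead
target the expected reduction of posterior classification uncertainty.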
An adsorbed gas estimation model for shale gas reservoirs via statistical learning
Shale gas plays an important role in reducing pollution and adjusting the
structure of world energy. Gas content estimation is particularly significant
in shale gas resource evaluation. There exist various estimation methods, such
as first principle methods and empirical models. However, resource evaluation
presents many challenges, especially the insufficient accuracy of existing
models and the high cost resulting from time-consuming adsorption experiments.
In this research, a low-cost and high-accuracy model based on geological
parameters is constructed through statistical learning methods to estimate
adsorbed shale gas content.
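Although the abstract does not spell out the pipeline, the basic workflow of
such a model is easy to picture. The sketch below is a hedged illustration: the
CSV file, the geological column names, and the choice of a random-forest
regressor are hypothetical stand-ins for whatever statistical learning method
the authors actually fit.

    # Illustrative sketch: a generic statistical-learning regressor mapping
    # geological parameters to adsorbed gas content. File and column names
    # are hypothetical.
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    df = pd.read_csv("shale_wells.csv")
    features = ["toc", "porosity", "pressure", "temperature", "clay_content"]
    X, y = df[features], df["adsorbed_gas_content"]

    model = RandomForestRegressor(n_estimators=300, random_state=0)
    rmse = -cross_val_score(model, X, y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"cross-validated RMSE: {rmse:.3f}")
    model.fit(X, y)   # final low-cost estimator, replacing adsorption experiments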
Uncertainty quantification of coal seam gas production prediction using Polynomial Chaos
A surrogate model approximates a computationally expensive solver. Polynomial
Chaos is a method to construct surrogate models by summing combinations of
carefully chosen polynomials. The polynomials are chosen to respect the
probability distributions of the uncertain input variables (parameters); this
allows for both uncertainty quantification and global sensitivity analysis.
In this paper we apply these techniques to a commercial solver for the
estimation of peak gas rate and cumulative gas extraction from a coal seam gas
well. The polynomial expansion is shown to honour the underlying geophysics
with low error when compared to a much more complex and computationally slower
commercial solver. We make use of advanced numerical integration techniques to
achieve this accuracy using relatively small amounts of training data.
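As a deliberately simplified picture of the method, the one-dimensional sketch
below expands a toy stand-in for the solver in probabilists' Hermite
polynomials of a standard-normal input; the coefficients are fitted by least
squares rather than by the advanced quadrature the paper relies on, and the
mean and variance then follow directly from the coefficients.

    # Minimal 1-D polynomial chaos sketch with a toy "solver"; the real study
    # targets a commercial coal seam gas simulator and uses advanced quadrature.
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander
    from math import factorial

    rng = np.random.default_rng(1)
    solver = lambda z: np.exp(0.3 * z) + 0.1 * z**2    # stand-in for the solver

    xi = rng.standard_normal(200)          # samples of the uncertain input
    degree = 6
    Psi = hermevander(xi, degree)          # columns He_0(xi) ... He_6(xi)
    coef, *_ = np.linalg.lstsq(Psi, solver(xi), rcond=None)

    mean = coef[0]                                     # E[He_k] = 0 for k >= 1
    var = sum(coef[k]**2 * factorial(k) for k in range(1, degree + 1))
    print(f"surrogate mean {mean:.4f}, variance {var:.4f}")

In several dimensions the same coefficients also yield Sobol sensitivity
indices by grouping the terms that involve each input, which is how the
expansion supports global sensitivity analysis.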
Alexandria: Extensible Framework for Rapid Exploration of Social Media
The Alexandria system under development at IBM Research provides an
extensible framework and platform for supporting a variety of big-data
analytics and visualizations. The system is currently focused on enabling rapid
exploration of text-based social media data. The system provides tools to help
with constructing "domain models" (i.e., families of keywords and extractors to
enable focus on tweets and other social media documents relevant to a project),
to rapidly extract and segment the relevant social media and its authors, to
apply further analytics (such as finding trends and anomalous terms), and to
visualize the results. The system architecture is centered on a variety
of REST-based service APIs to enable flexible orchestration of the system
capabilities; these are especially useful to support knowledge-worker driven
iterative exploration of social phenomena. The architecture also enables rapid
integration of Alexandria capabilities with other social media analytics
systems, as has been demonstrated through an integration with IBM Research's
SystemG. This paper describes a prototypical usage scenario for Alexandria,
along with the architecture and key underlying analytics.
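To make the REST-centric orchestration concrete, a client interaction might
look like the sketch below; the host, endpoint paths, and JSON fields are
invented for illustration, since the paper does not publish the API surface.

    # Hypothetical client sketch of the REST-style orchestration described
    # above; all URLs and payload fields are illustrative, not Alexandria's
    # actual API.
    import requests

    BASE = "https://alexandria.example.com/api"

    # Register a domain model: keyword families that scope the social media pull.
    domain = {"name": "flu-outbreak", "keywords": ["flu", "influenza", "fever"]}
    domain_id = requests.post(f"{BASE}/domains", json=domain,
                              timeout=30).json()["id"]

    # Trigger extraction/segmentation, then ask for a trend analytic on the results.
    requests.post(f"{BASE}/domains/{domain_id}/extract", timeout=30)
    trends = requests.get(f"{BASE}/domains/{domain_id}/analytics/trends",
                          timeout=30).json()
    print(trends)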
Non-functional requirements: size measurement and testing with COSMIC-FFP
The non-functional requirements (NFRs) of software systems are well known to add a degree of uncertainty to the process of estimating the cost of any project. This paper contributes to more precise project size measurement by incorporating NFRs into the functional size quantification process. We report on an initial solution proposed to deal with the problem of quantitatively assessing the NFR modeling process early in the project, and of generating test cases for NFR verification purposes. The NFR framework has been chosen for the integration of NFRs into the requirements modeling process and for their quantitative assessment. Our proposal is based on the functional size measurement method COSMIC-FFP, adopted in 2003 as the ISO/IEC 19761 standard. We also extend the use of COSMIC-FFP for NFR testing purposes. This is an essential step for improving NFR development and testing effort estimates, and consequently for managing the scope of NFRs. We discuss the merits of the proposed approach and the open questions related to its design.
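COSMIC sizing itself reduces to counting data movements: every Entry, Exit,
Read, or Write identified in a functional process contributes one unit of
functional size (one COSMIC Function Point in current terminology). The sketch
below tallies such movements for a hypothetical functional process that also
includes NFR-derived movements, in the spirit of the extension proposed above;
the listed movements are invented for illustration.

    # Minimal sketch of COSMIC (ISO/IEC 19761) sizing: 1 CFP per data movement.
    # The movements listed, including the NFR-derived ones, are hypothetical.
    from collections import Counter

    MOVEMENT_TYPES = {"Entry", "Exit", "Read", "Write"}

    functional_process = [
        ("Entry", "login request"),
        ("Read", "stored credentials"),
        ("Exit", "login response"),
        ("Entry", "response-time threshold"),   # NFR-derived (illustrative)
        ("Write", "audit log record"),          # NFR-derived (illustrative)
    ]

    assert all(kind in MOVEMENT_TYPES for kind, _ in functional_process)
    size_cfp = len(functional_process)           # functional size in CFP
    breakdown = Counter(kind for kind, _ in functional_process)
    print(f"Functional size: {size_cfp} CFP, breakdown: {dict(breakdown)}")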