Modeling of Groundwater Flow and Radionuclide Transport at the Climax Mine sub-CAU, Nevada Test Site
Conflicting Objectives in Decisions
This book deals with quantitative approaches to making decisions when conflicting objectives are present. This problem is central to many applications of decision analysis, policy analysis, and operational research in a wide range of fields, for example business, economics, engineering, psychology, and planning. The book surveys different approaches to the same problem area, and each approach is discussed in considerable detail, so that the coverage is both broad and deep. The problem of conflicting objectives is of paramount importance in both planned and market economies, and this book represents a cross-cultural mixture of approaches from many countries to the same class of problem.
Stochastic analysis of nonlinear dynamics and feedback control for gene regulatory networks with applications to synthetic biology
The focus of the thesis is the investigation of the generalized repressilator model
(repressing genes ordered in a ring structure). Using nonlinear bifurcation analysis
stable and quasi-stable periodic orbits in this genetic network are characterized
and a design for a switchable and controllable genetic oscillator is proposed. The
oscillator operates around a quasi-stable periodic orbit using the classical engineering
idea of read-out based control. Previous genetic oscillators have been
designed around stable periodic orbits; here we explore the possibility of
a quasi-stable periodic orbit, expecting better controllability.
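The generalized repressilator can be sketched in a reduced, protein-only form: n genes on a ring, each repressing its successor through a Hill function. The parameter values (beta, h), the protein-only reduction, and the forward-Euler integrator below are illustrative assumptions, not the thesis's exact formulation.

```python
def repressilator_rhs(x, beta=10.0, h=3.0):
    """dx_i/dt = beta / (1 + x_{i-1}^h) - x_i for genes arranged on a ring."""
    n = len(x)
    return [beta / (1.0 + x[(i - 1) % n] ** h) - x[i] for i in range(n)]

def simulate(x0, t_end=50.0, dt=0.01, beta=10.0, h=3.0):
    """Forward-Euler integration; returns the full trajectory of states."""
    x = list(x0)
    traj = [list(x)]
    for _ in range(round(t_end / dt)):
        dx = repressilator_rhs(x, beta, h)
        x = [xi + dt * dxi for xi, dxi in zip(x, dx)]
        traj.append(list(x))
    return traj

# Odd-length rings (here the classic n = 3 repressilator) admit sustained
# oscillations for sufficiently steep repression; even-length rings instead
# show the long-lived quasi-stable transients studied in the thesis.
traj = simulate([5.0, 1.0, 0.1])
```

An asymmetric initial condition is used so the trajectory leaves the unstable symmetric steady state quickly.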
The ring topology of the generalized repressilator model has spatio-temporal
symmetries that can be understood as propagating perturbations in discrete lattices.
Network topology is a universal concept that transfers across disciplines, and
based on it analytical conditions for the emergence of stable and quasi-stable
periodic orbits are derived; the length and distribution of the quasi-stable oscillations
are also obtained. The findings suggest that long-lived transient dynamics
due to feedback loops can dominate gene network dynamics.
Taking the stochastic nature of gene expression into account, a master equation
for the generalized repressilator is derived. The stochasticity is shown to influence
the onset of bifurcations and the quality of oscillations. Internal noise is shown to
have an overall stabilizing effect on the oscillating transients emerging from the
quasi-stable periodic orbits.
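A standard way to sample trajectories consistent with such a master equation is Gillespie's stochastic simulation algorithm. The sketch below applies it to the simplest constitutive birth-death expression model; the rates (k, gamma) and the model choice are illustrative assumptions, far simpler than the full repressilator master equation.

```python
import random

def gillespie_birth_death(k=10.0, gamma=1.0, n0=0, t_end=100.0, seed=1):
    """Exact stochastic simulation of production at rate k and
    first-order degradation at rate gamma * n."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    copy_numbers = []
    while t < t_end:
        a_birth, a_death = k, gamma * n
        a_total = a_birth + a_death
        t += rng.expovariate(a_total)          # waiting time to the next event
        if rng.random() * a_total < a_birth:   # pick which reaction fires
            n += 1
        else:
            n -= 1
        copy_numbers.append(n)
    return copy_numbers

# At stationarity the copy number is Poisson-distributed with mean k / gamma.
samples = gillespie_birth_death()
```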
The insights from the read-out based control scheme for the genetic oscillator
lead us to the idea of implementing an algorithmic controller that would direct
any genetic circuit to a desired state. The algorithm operates model-free, i.e. in
principle it is applicable to any genetic network, and the only input information it needs is a
data matrix of measured time series of the network dynamics. The application
areas for read-out based control in genetic networks range from classical tissue
engineering to stem cell specification, wherever a quantitatively and temporally
targeted intervention is required.
Multiscale mathematical models for simulation and scale-up of green processes in the perspective of industrial sustainability
This work presents research studies aimed at developing tools for the design of
engineering solutions that move in the direction of industrial sustainability. The investigations
discussed hereinafter concern a process for extracting active compounds – polyphenols – from
agro-food industry wastes (olive and grape pomaces) and a biorefinery exploiting waste
frying oil, solid organic wastes and algal biomass to produce biofuels. In particular, for the
former topic, a procedure for evaluating the technological feasibility of the process at pilot scale
is discussed. The proposed approach couples the extended
kinetic route with mathematical simulation. Detailed, physically based dynamic
mathematical models, taking into account mass and energy balance equations, are adopted
to describe both the lab-scale and the pilot-scale reactors. The physicochemical parameters
appearing in the models are estimated from lab-scale experimental data or are partially
taken from the literature. Different heating systems are designed for the pilot-scale reactor and
their performance is tested by simulation. Characteristic times are also evaluated during
start-up, and different control loops are analyzed in order to set up the best process and
operating variables. Average polyphenol yields are finally evaluated for both the batch
and the continuously operated pilot reactor, considering feed variability and fluctuations of
process parameters.
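The structure of such a physically based dynamic model can be sketched with a toy batch balance: first-order Arrhenius extraction kinetics (mass balance) coupled to an energy balance with a constant heating duty. Every parameter value below (rate constant, activation energy, duty, hold-up, fluid properties) is an illustrative assumption, not an estimate from the thesis.

```python
import math

def simulate_batch(C0=1.0, T0=293.15, Q=2000.0, t_end=600.0, dt=0.1,
                   k0=1e5, Ea=40e3, V=0.05, rho=1000.0, cp=4186.0):
    """Euler integration of dC/dt = -k(T) C and dT/dt = Q / (V rho cp)."""
    R = 8.314                                  # J / (mol K)
    C, T = C0, T0
    for _ in range(round(t_end / dt)):
        k = k0 * math.exp(-Ea / (R * T))       # Arrhenius rate constant, 1/s
        C += dt * (-k * C)                     # mass balance on the solute
        T += dt * Q / (V * rho * cp)           # energy balance, heating only
    return C, T

# Ten minutes of heated batch extraction: most of the solute is extracted
# while the charge warms by a few kelvin.
C_final, T_final = simulate_batch()
```

Comparing the thermal and kinetic time constants that emerge from such a model is what "characteristic times" evaluation amounts to in practice.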
Regarding the biorefinery, special attention was devoted to modeling the
airlift reactor, its most delicate and complex component. To optimize this
microalgae cultivation system, a precise description of the moving interfaces formed by the
liquid and gas phases is critical. In this study, coupled front-capturing methods (standard and
conservative level set methods) and a finite difference method are used to simulate gas bubble
dynamics in a pilot-scale external-loop airlift photobioreactor in which microalgae are used
to capture CO2 from flue gas and to treat wastewater. Numerical simulations are carried out
on rectangular domains representing different sections of the vertical axis of the riser. The
data employed was either acquired from previous experimental campaigns carried out in the
airlift reactor or found in the literature. The rise, shape dynamics and coalescence process of
the bubbles of flue gas are studied.
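The front-capturing idea can be illustrated in one dimension: the interface is stored implicitly as the zero level set of a signed-distance function phi, which is advected with an upwind finite-difference scheme. The real simulations are multidimensional with standard and conservative level-set formulations; the grid, velocity, and inflow treatment below are illustrative assumptions.

```python
def advect_level_set(phi, u, dx, dt, steps):
    """Advance phi_t + u phi_x = 0 (u > 0) with first-order upwinding."""
    phi = list(phi)
    for _ in range(steps):
        new = [phi[0] - u * dt]  # inflow boundary: continue the linear profile
        for i in range(1, len(phi)):
            new.append(phi[i] - u * dt / dx * (phi[i] - phi[i - 1]))
        phi = new
    return phi

nx, dx = 200, 0.01                         # 2 m domain, 1 cm cells
u, dt = 0.1, 0.05                          # CFL number u * dt / dx = 0.5
phi0 = [i * dx - 0.5 for i in range(nx)]   # interface (zero crossing) at x = 0.5
phi = advect_level_set(phi0, u, dx, dt, steps=100)
# After t = 5 s the interface sits at x = 0.5 + u * t = 1.0 (cell 100).
```

Because the initial profile is linear, first-order upwinding reproduces the translation exactly here; for curved fronts the conservative variant is preferred precisely because plain upwinding smears them.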
Moreover, for each analyzed application, a procedure based on the Buckingham π-theorem
is proposed to perform a rigorous scale-up. In this way, scale-invariant dimensionless
groups describing and summarizing the considered processes could be identified. For the
research focused on the scale-up of photobioreactors used to cultivate Chlorella vulgaris,
an experimental campaign at three levels was designed and carried out to evaluate the
characteristic dimensionless numbers identified by the theoretical formulation. Since
the scale-up concerns both the geometrical dimensions and the type of reactor, passing from lab-scale
stirred tanks to pilot-scale tubular and airlift reactors, particular attention was devoted to defining the
characteristic lengths inside the dimensionless numbers.
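The scale-up logic can be sketched as: compute the dimensionless groups at lab scale, then choose pilot-scale operating conditions that preserve them. The specific groups below (Reynolds and Froude numbers for the riser) and all property values are illustrative assumptions, not the set identified in the thesis.

```python
import math

def reynolds(rho, u, L, mu):
    """Re = rho u L / mu: inertial vs. viscous forces."""
    return rho * u * L / mu

def froude(u, L, g=9.81):
    """Fr = u / sqrt(g L): inertial vs. gravitational forces."""
    return u / math.sqrt(g * L)

# Lab scale: water-like broth in a 5 cm riser at 0.2 m/s superficial velocity.
re_lab = reynolds(rho=1000.0, u=0.2, L=0.05, mu=1e-3)
fr_lab = froude(u=0.2, L=0.05)

# Pilot scale: 10x characteristic length. Matching Re fixes the velocity...
L_pilot = 0.5
u_pilot = re_lab * 1e-3 / (1000.0 * L_pilot)   # u = Re mu / (rho L)

# ...but then Fr is no longer matched: with a single fluid not all groups
# can be preserved at once, which is why identifying the governing groups
# (and the characteristic lengths inside them) matters.
fr_pilot = froude(u=u_pilot, L=L_pilot)
```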
Essays on Cross-Sectionally Dependent Panel Data with an Application to Fiscal Policy in the European Monetary Union
In the past two decades, macroeconomists have used panel data to study the merits of fiscal policy for economic stabilisation. The datasets considered in these studies typically consist of a small number of time series corresponding to countries. This configuration does not match the archetypical survey-style panel dataset for which a large literature on estimation and hypothesis testing exists. This PhD thesis develops an estimation methodology catered to macroeconomists: in four self-contained chapters, we develop a methodology for the estimation of dynamic models in the small N, large T framework in the presence of cross-sectional dependence in the error term.
In the first chapter we examine the effect of factors on the point estimates of several estimators commonly used in the empirical literature, and we find that these estimators are inconsistent. We also propose an estimator that is consistent for the parameters of the model studied in that chapter.
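The inconsistency can be seen in a toy simulation: if the error contains a serially correlated common factor f_t with heterogeneous loadings, the lagged dependent variable is correlated with the error, so pooled OLS of y_it on y_i,t-1 stays biased no matter how large T gets. The data-generating process and all parameter values below are assumptions chosen for illustration, not the chapter's design.

```python
import random

def simulate_panel(N=4, T=20000, rho=0.5, phi=0.8, seed=7):
    """Small-N, large-T dynamic panel with a factor error structure:
    y_it = rho * y_i,t-1 + lambda_i * f_t + e_it, f_t an AR(1) factor."""
    rng = random.Random(seed)
    lam = [rng.uniform(0.5, 1.5) for _ in range(N)]   # heterogeneous loadings
    f, fs = 0.0, []
    for _ in range(T):
        f = phi * f + rng.gauss(0.0, 1.0)             # serially correlated factor
        fs.append(f)
    panel = []
    for i in range(N):
        y, series = 0.0, []
        for t in range(T):
            y = rho * y + lam[i] * fs[t] + rng.gauss(0.0, 1.0)
            series.append(y)
        panel.append(series)
    return panel

def pooled_ols_ar1(panel):
    """Pooled OLS slope of y_it on y_i,t-1, ignoring the factor structure."""
    num = den = 0.0
    for series in panel:
        for t in range(1, len(series)):
            num += series[t - 1] * series[t]
            den += series[t - 1] ** 2
    return num / den

rho_hat = pooled_ols_ar1(simulate_panel())   # biased well above the true 0.5
```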
In the second chapter we develop consistent quasi-difference GMM estimators and inferential procedures
for the small N, large T dynamic panel data model with factor error structures. We also prove consistency
and mixed-normality of the estimator when the number of factors is over-estimated.
In the third chapter we consider the large N, large T framework and show that the first eigenvalues of the
covariance matrix of an approximate factor model are dominated by the factors, whereas the remainder is
controlled by the residual noise. We show that this result is the basis for any consistent inferential procedure about the number of factors R, and that it continues to hold when R grows large, when the factors are weak and, importantly, in the large N, large T interactive fixed effects model.
In the fourth chapter we study fiscal policy using the methods developed in the thesis. We estimate
vector autoregressions from European countries and restrict the impulse-response functions to adhere to the
Stability and Growth Pact. We find that this one-size-fits-all approach is not appropriate for stabilization of
the European economy.
Advances in approximate Bayesian computation and trans-dimensional sampling methodology
Bayesian statistical models continue to grow in complexity, driven
in part by a few key factors: the massive computational resources
now available to statisticians; the substantial gains made in
sampling methodology and algorithms such as Markov chain
Monte Carlo (MCMC), trans-dimensional MCMC (TDMCMC), sequential
Monte Carlo (SMC), adaptive algorithms, stochastic
approximation methods, and approximate Bayesian computation (ABC);
and development of more realistic models for real world phenomena
as demonstrated in this thesis for financial models and
telecommunications engineering. Sophisticated statistical models
are increasingly proposed as practical solutions to real-world problems in order to better capture the salient features of
increasingly complex data. With sophistication comes a
parallel requirement for more advanced and automated statistical
computational methodologies.
The key focus of this thesis revolves around innovation related to
the following three significant Bayesian research questions.
1. How can one develop practically useful Bayesian models and corresponding computationally efficient sampling methodology, when the likelihood model is intractable?
2. How can one develop methodology in order to automate Markov chain Monte Carlo sampling approaches to efficiently explore the support of a posterior distribution, defined across multiple Bayesian statistical models?
3. How can these sophisticated Bayesian modelling frameworks and sampling methodologies be utilized to solve practically relevant and important problems in the research fields of financial risk modeling and telecommunications engineering?
This thesis is split into three bodies of work represented in
three parts. Each part contains journal papers with novel
statistical model and sampling methodological development. The
coherent link between each part involves the novel
sampling methodologies developed in Part I and utilized in Part II and Part III. Papers contained in
each part make progress at addressing the core research
questions posed.
Part I of this thesis presents generally applicable key
statistical sampling methodologies that will be utilized and
extended in the subsequent two parts. In particular it presents
novel developments in statistical methodology pertaining to
likelihood-free or ABC and TDMCMC methodology.
The TDMCMC methodology focuses on several aspects of automation
in the between model proposal construction, including
approximation of the optimal between model proposal kernel via a
conditional path sampling density estimator. Then this methodology
is explored for several novel Bayesian model selection
applications, including cointegrated vector autoregression (CVAR)
models and mixture models in which there is an unknown number of
mixture components. The second area relates to the development of
ABC methodology, with particular focus
on SMC sampler methodology in an ABC context via Partial
Rejection Control (PRC). In addition to novel algorithmic
development, key theoretical properties are also studied for the
classes of algorithms developed. This methodology is then
applied to a highly challenging and practically significant
application relating to multivariate Bayesian α-stable
models.
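The likelihood-free (ABC) principle underlying this part can be reduced to a minimal rejection sampler: draw a parameter from the prior, simulate data under it, and accept the draw when a summary statistic of the simulated data lands within a tolerance of the observed one. The toy Gaussian model, uniform prior, and tolerance below are illustrative assumptions, far simpler than the SMC-PRC samplers developed here.

```python
import random

def abc_rejection(observed_mean, n_obs, tol=0.05, n_draws=20000, seed=3):
    """ABC rejection: accept theta when the simulated sample mean lies
    within tol of the observed sample mean (the summary statistic)."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-10.0, 10.0)          # draw from the prior
        sim_mean = sum(rng.gauss(theta, 1.0) for _ in range(n_obs)) / n_obs
        if abs(sim_mean - observed_mean) < tol:   # distance between summaries
            accepted.append(theta)
    return accepted

# The accepted draws approximate the posterior of the mean; no likelihood
# is ever evaluated.
posterior = abc_rejection(observed_mean=2.0, n_obs=50)
```

The low acceptance rate of plain rejection is exactly what SMC samplers with partial rejection control are designed to overcome.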
Part II then focuses on novel statistical model development
in the areas of financial risk and non-life insurance claims
reserving. In each of the papers in this part the focus is on
two aspects: foremost the development of novel statistical models
to improve the modeling of risk and insurance; and then the
associated problem of how to fit and sample from such statistical
models efficiently. In particular, novel statistical models are
developed for Operational Risk (OpRisk) under a Loss Distributional
Approach (LDA) and for claims reserving in Actuarial non-life
insurance modelling. In each case the models developed include an
additional level of complexity which adds flexibility to the model
in order to better capture salient features observed in real data.
This additional complexity comes at a cost:
standard fitting and sampling methodologies are generally not
applicable, and as a result one is required to develop and apply the
methodology from Part I.
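The Loss Distributional Approach structure underpinning the OpRisk models can be sketched by Monte Carlo: an annual loss is a Poisson-distributed number of severity draws, and the capital figure is a high quantile of the resulting compound distribution. The Poisson rate, lognormal severity, and quantile level below are illustrative assumptions; the thesis's models add flexibility on top of this basic compound structure.

```python
import math
import random

def poisson_draw(rng, lam):
    """Knuth's inversion method for Poisson draws, adequate for small lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def lda_annual_losses(lam=5.0, mu=10.0, sigma=1.0, n_years=50000, seed=11):
    """Compound Poisson-lognormal annual losses (basic LDA structure)."""
    rng = random.Random(seed)
    return [sum(rng.lognormvariate(mu, sigma)
                for _ in range(poisson_draw(rng, lam)))
            for _ in range(n_years)]

losses = sorted(lda_annual_losses())
var_999 = losses[int(0.999 * len(losses))]   # 99.9% quantile: the capital figure
```

Once the severity or frequency model gains extra structure (the added flexibility described above), this plain Monte Carlo step is what stops being straightforward and calls for the Part I samplers.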
Part III focuses on novel statistical model development
in the area of statistical signal processing for wireless
communications engineering. Statistical models will be developed
or extended for two general classes of wireless communications
problem: the first relates to detection of transmitted symbols and
joint channel estimation in Multiple Input Multiple Output (MIMO)
systems coupled with Orthogonal Frequency Division Multiplexing
(OFDM); the second relates to co-operative wireless communications
relay systems in which the key focus is on detection of
transmitted symbols. Both areas require the advanced
sampling methodology developed in Part I to find solutions to
these real-world engineering problems.