Some Notes about Inference for the Lognormal Diffusion Process with Exogenous Factors
Different versions of the lognormal diffusion process with exogenous factors have been
used in recent years to model and study the behavior of phenomena following a given growth curve.
In each case considered, the estimation of the model has been addressed, generally by maximum
likelihood (ML), as has been the study of several characteristics associated with the type of curve
considered. For this process, a unified version of the ML estimation problem is presented, including
how to obtain estimation errors and asymptotic confidence intervals for parametric functions when no
explicit expression is available for the estimators of the parameters of the model. The Gompertz-type
diffusion process is used here to illustrate the application of the methodology.
This work was supported in part by the Ministerio de Economía, Industria y Competitividad,
Spain, under Grants MTM2014-58061-P and MTM2017-85568-P.
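The asymptotic-confidence-interval idea described above can be sketched in a few lines. The example below is a simplification, not the paper's model: it replaces the diffusion process with an i.i.d. lognormal sample (so the ML estimates have closed forms) and builds a delta-method interval for the parametric function g(μ, σ²) = exp(μ + σ²/2), the mean of the lognormal.

```python
import math
import random

random.seed(1)

# Hypothetical data: i.i.d. lognormal sample standing in for observations
# of the process (the paper works with a diffusion; this is a simplification).
n = 500
mu_true, sigma_true = 0.3, 0.5
x = [math.exp(random.gauss(mu_true, sigma_true)) for _ in range(n)]

# Closed-form ML estimates of (mu, sigma^2) on the log scale.
logs = [math.log(v) for v in x]
mu_hat = sum(logs) / n
sigma2_hat = sum((l - mu_hat) ** 2 for l in logs) / n

# Parametric function g(mu, sigma^2) = E[X] = exp(mu + sigma^2 / 2).
g_hat = math.exp(mu_hat + sigma2_hat / 2)

# Delta method: grad g = (g, g/2); the asymptotic variances of the ML
# estimators are Var(mu_hat) = sigma^2/n and Var(sigma2_hat) = 2 sigma^4/n.
var_g = (g_hat ** 2 * (sigma2_hat / n)
         + (g_hat / 2) ** 2 * (2 * sigma2_hat ** 2 / n))
se = math.sqrt(var_g)
ci = (g_hat - 1.96 * se, g_hat + 1.96 * se)
```

When no explicit estimator exists, as in the general case treated by the paper, the same delta-method step applies once the maximum and the observed information matrix are obtained numerically.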
T-Growth Stochastic Model: Simulation and Inference via Metaheuristic Algorithms
The main objective of this work is to introduce a stochastic model associated with
the behavior described by the T-growth curve, which is in turn a modification of the logistic curve.
By conveniently reformulating the T curve, it may be obtained as a solution to a linear differential
equation. This greatly simplifies the mathematical treatment of the model and allows a diffusion
process to be defined, which is derived from the non-homogeneous lognormal diffusion process,
whose mean function is a T curve. This allows the phenomenon under study to be viewed in a
dynamic way. In these pages, the distribution of the process is obtained, as are its main characteristics.
The maximum likelihood estimation procedure is carried out by optimization via metaheuristic
algorithms. Thanks to an exhaustive study of the curve, a strategy is obtained to bound the parametric
space, which is a requirement for the application of various swarm-based metaheuristic algorithms.
A simulation study is presented to show the validity of the bounding procedure and an example
based on real data is provided.
Ministerio de Economía, Industria y Competitividad, Spain, under Grant MTM2017-85568-P; FEDER/Junta de Andalucía-Consejería de Economía
y Conocimiento, Spain, Grant A-FQM-456-UGR1.
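The dynamic view described above can be illustrated by simulating sample paths of a non-homogeneous lognormal diffusion dX_t = h(t)·X_t·dt + σ·X_t·dW_t on the log scale. The drift h below is a hypothetical logistic-type growth rate, not the authors' T-curve parametrisation; the scheme is exact in the noise term and uses a left-endpoint rule for the drift.

```python
import math
import random

def simulate_path(x0, h, sigma, t0, t1, steps, rng):
    """Log-scale scheme for dX_t = h(t) X_t dt + sigma X_t dW_t."""
    dt = (t1 - t0) / steps
    x, t, path = x0, t0, [x0]
    for _ in range(steps):
        z = rng.gauss(0.0, 1.0)
        x *= math.exp((h(t) - 0.5 * sigma ** 2) * dt
                      + sigma * math.sqrt(dt) * z)
        t += dt
        path.append(x)
    return path

# Hypothetical logistic-type growth rate (not the authors' exact T-curve
# drift): the rate decays as the phenomenon saturates.
def h(t):
    return 1.5 * math.exp(-t) / (1.0 + math.exp(-t))

rng = random.Random(7)
path = simulate_path(1.0, h, 0.05, 0.0, 10.0, 1000, rng)
```

With σ = 0 the path reduces to the deterministic growth curve x0·exp(∫h), which gives a quick sanity check on the discretisation.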
Statistical analysis and first-passage-time applications of a lognormal diffusion process with multi-sigmoidal logistic mean
We consider a lognormal diffusion process having a multisigmoidal logistic mean,
useful for modeling the evolution of a population that reaches its maximum level of
growth after several stages. Regarding the problem of statistical inference, two
procedures for finding the maximum likelihood estimates of the unknown parameters
are described. One is based on solving the system of critical points of the
likelihood function; the other, on maximizing the likelihood function with the
simulated annealing algorithm. A simulation study validating the described
estimation strategies is also presented, together with a real application to
epidemiological data. Special attention is also devoted to the first-passage-time
problem of the considered diffusion process through a fixed boundary.
Università degli Studi di Salerno within the CRUI-CARE Agreement.
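The second estimation procedure, maximizing the likelihood with simulated annealing over a bounded parameter space, can be sketched as follows. For a self-contained illustration, the likelihood is replaced by a least-squares loss for fitting a logistic-type mean to noiseless synthetic data, and the annealer is a plain textbook version, not the authors' configuration; parameter names and bounds are assumptions.

```python
import math
import random

def logistic_sum(t, params):
    # Sum of logistic terms; a simplified stand-in for the multisigmoidal
    # logistic mean (not the authors' exact parametrisation).
    return sum(k / (1.0 + math.exp(-r * (t - c))) for k, r, c in params)

def anneal(loss, bounds, rng, iters=20000, t0=1.0):
    # Plain simulated annealing on a box-bounded parameter space: bounded
    # search regions are exactly what such algorithms require.
    x = [rng.uniform(lo, hi) for lo, hi in bounds]
    fx = loss(x)
    best, fbest = x[:], fx
    for i in range(iters):
        temp = t0 * (1.0 - i / iters) + 1e-9        # linear cooling
        cand = [min(max(xi + rng.gauss(0.0, 0.1 * (hi - lo)), lo), hi)
                for xi, (lo, hi) in zip(x, bounds)]
        fc = loss(cand)
        # Metropolis acceptance: always take improvements, sometimes worse.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / temp):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x[:], fx
    return best, fbest

# Noiseless synthetic data from a single logistic stage (k, r, c).
ts = [i * 0.5 for i in range(21)]                   # t = 0 .. 10
data = [logistic_sum(t, [(10.0, 1.0, 5.0)]) for t in ts]
sse = lambda p: sum((logistic_sum(t, [tuple(p)]) - y) ** 2
                    for t, y in zip(ts, data))
bounds = [(1.0, 20.0), (0.1, 3.0), (0.0, 10.0)]
best, fbest = anneal(sse, bounds, random.Random(5))
```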
Hyperbolastic type-III diffusion process: Obtaining from the generalized Weibull diffusion process
The modeling of growth phenomena has become a matter of great interest in many different
fields of application and research. New stochastic models have been developed, and others have been
updated to this end. The present paper introduces a diffusion process whose main characteristic is that
its mean function belongs to a wide family of curves derived from the classic Weibull curve. The main
characteristics of the process are described and, as a particular case, a diffusion process is considered
whose mean function is the hyperbolastic curve of type III, which has proven useful in the study of cell
growth phenomena. By studying its estimation we are able to describe the behavior of such growth
patterns. This work considers the problem of the maximum likelihood estimation of the parameters
of the process, including strategies to obtain initial solutions for the system of equations that must be
solved. Some examples are provided based on simulated sample paths and real data to illustrate the
development carried out.
This work was supported in part by the Ministerio de Economía, Industria y Competitividad, Spain,
under Grant MTM2017-85568-P.
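The need for initial solutions to the ML equations can be illustrated on the classic Weibull growth curve y(t) = a − b·exp(−c·tᵈ), a simple member of the family the paper generalizes. The linearization strategy below is one possible illustration under that curve, not necessarily the authors' procedure: guess the asymptote a from the data, take b from the value at t = 0, and fit log(log(b) − log(a − y)) = log(c) + d·log(t) by least squares.

```python
import math

def weibull_curve(t, a, b, c, d):
    # Classic Weibull growth curve; the paper's family generalizes
    # curves of this type.
    return a - b * math.exp(-c * t ** d)

def initial_values(ts, ys):
    # Illustrative initial-value strategy (an assumption, not necessarily
    # the authors'): asymptote from the data, b from y(0) = a - b, then a
    # straight-line fit of log(log(b) - log(a - y)) against log(t).
    a0 = max(ys) * 1.001
    b0 = a0 - ys[0]
    pts = [(math.log(t), math.log(math.log(b0) - math.log(a0 - y)))
           for t, y in zip(ts, ys) if t > 0]
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    d0 = (sum((u - mx) * (v - my) for u, v in pts)
          / sum((u - mx) ** 2 for u, _ in pts))
    c0 = math.exp(my - d0 * mx)
    return a0, b0, c0, d0

# Recover plausible starting values from exact curve data.
ts = list(range(11))
ys = [weibull_curve(t, 10.0, 9.0, 0.2, 1.5) for t in ts]
a0, b0, c0, d0 = initial_values(ts, ys)
```

Starting values of this kind are then refined by solving the full ML system numerically.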
Simulated Annealing
The book contains 15 chapters presenting recent contributions of top researchers working with Simulated Annealing (SA). Although it represents a small sample of the research activity on SA, the book will certainly serve as a valuable tool for researchers interested in getting involved in this multidisciplinary field. In fact, one of its salient features is that the book is highly multidisciplinary in terms of application areas, since it assembles experts from the fields of Biology, Telecommunications, Geology, Electronics and Medicine.
Heuristic solution techniques for a spatial harvest scheduling problem involving wildlife habitat and timber income
Three heuristic techniques, simulated annealing (SA), tabu search (TS), and tabu search with strategic oscillation (TSSO), were used to schedule silvicultural activities designed to accelerate development of older forest structure at both stand and landscape scales over a 2450-acre forest located in northwestern Oregon. Goals for the forest over a 100-year planning horizon included reaching at least 500 acres of older forest structure with at least one contiguous 200-acre (or larger) block as soon as possible. The configuration and location, but not the amounts, of the older forest structure acres and the contiguous block were then free to move about the forest through time while best meeting the goal of producing a high, steady revenue flow over the entire planning horizon, subject to restrictions on maximum clearcut patch size. The heuristic techniques were able to provide feasible tactical schedules fulfilling the strategic goals over the entire horizon in ways which traditional forest planning tools cannot. Of the three techniques examined, TSSO produced schedules with the best, most consistent objective function values. SA yielded a wider range of values which were always slightly worse but required only a fraction of the computing time. Straightforward TS produced relatively poor objective function values, most likely because of its inability to search the infeasible regions of the diverse solution space. Estimation of the globally optimal objective function value using Weibull distributions suggested that all TSSO solutions were within 1.8% of the optimum, the best being within 0.03%, while all SA solutions were within 7.6%, the best being within 1.7%. However, 95% confidence intervals of the Weibull location parameter estimates for the SA and TSSO distributions did not overlap, even though neither distribution of results was rejected as fitting a Weibull distribution.
This disparity again suggests that statistical inference of global optima may, by itself, be an inadequate means of assessing how "good" a heuristic is.
Optimization of highway work zone decisions considering short-term and long-term impacts
With the increase in the number, duration, and scope of maintenance projects on the national highway system, transportation agencies face great challenges in developing effective, comprehensive work zone management plans that minimize the negative impacts on road users and workers. The types of maintenance operation, timing, duration, configuration, and user impact mitigation strategies are major considerations in developing work zone management plans. Some of those decisions may not only affect road users during the maintenance phase but also have significant impacts on pavement serviceability in future years.
This dissertation proposes a systematic methodology for jointly optimizing critical work zone decisions, based on analytical and simulation models developed to estimate short-term impacts during the maintenance periods and long-term impacts over the pavement life cycle.
The dissertation starts by modeling the effects of different work zone decisions on agency and user costs during the maintenance phase. An analytic one-time work zone cost model is then formulated based on simulation analysis results. Next, a short-term work zone decision optimization model is developed to find the best combination of lane closure and traffic control strategies which can minimize the one-time work zone cost. Considering the complex and combinatorial nature of this optimization problem, a heuristic optimization algorithm, named two-stage modified population-based simulated annealing (2PBSA), is designed to search for a near-optimal solution. For those maintenance projects that may need more detailed estimation of user delay or other impacts, a simulation-based optimization method is proposed in this study. Through a hybrid approach combining simulation and analytic methods along with parallel computing techniques, the proposed method can yield satisfactory solutions while reducing computational efforts to a more acceptable level. The last part of this study establishes a framework for jointly optimizing short-term and long-term work zone decisions with the objective of maximizing cost-effectiveness. Case studies are conducted to test the performance of the proposed methods and develop guidelines for development of work zone management plans.
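The tradeoff that makes one-time work zone cost minimization non-trivial can be sketched with a toy model: short zones multiply repeated setup costs, long zones inflate user delay per closure. The cost function and all numbers below are hypothetical illustrations, not the dissertation's calibrated models or the 2PBSA algorithm.

```python
# Toy one-time work zone cost (hypothetical coefficients): a road of
# road_km is maintained in zones of zone_km each; every zone incurs a
# fixed setup cost, and user delay is assumed quadratic in zone length.
def one_time_cost(zone_km, setup=1000.0, delay_coeff=50.0, road_km=10.0):
    n_zones = road_km / zone_km
    setup_cost = n_zones * setup
    delay_cost = n_zones * delay_coeff * zone_km ** 2
    return setup_cost + delay_cost

# Enumerate candidate zone lengths; with this cost shape the optimum
# lies near sqrt(setup / delay_coeff) km.
candidates = [0.5 + 0.1 * i for i in range(96)]     # 0.5 .. 10.0 km
best_len = min(candidates, key=one_time_cost)
```

Once the decision space grows to combinations of closure strategy, timing, and traffic control, plain enumeration becomes infeasible, which is what motivates heuristic search in the dissertation.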
Aircraft trajectory management in air traffic control at the pre-tactical and tactical levels
Global air traffic demand is continuously increasing and is predicted
to triple by 2050. The need for increased air traffic capacity motivates a
shift of ATM towards Trajectory Based Operations (TBOs). This implies the
possibility of designing efficient congestion-free aircraft trajectories further in
advance (at the pre-tactical, strategic level), reducing controllers' workload at the
tactical level. As a consequence, controllers will be able to manage more flights.
Current flow management practice in the air traffic management (ATM)
system shows that, under the present system settings, only timid
demand management actions are taken prior to the day of operation, such as slot
allocation and strategic flow rerouting. But the choice of air route for a
particular flight is seen as a commercial decision to be taken by airlines, given
air traffic control constraints. This thesis investigates the potential of robust
trajectory planning (considered as an additional demand management action)
at the pre-tactical level as a means to alleviate en-route congestion in airspace.
Robust trajectory planning (RTP) involves the generation of congestion-free
trajectories with minimum operating cost, taking into account the uncertainty of
trajectory prediction and unforeseen events. Although the planned cost could be
higher than that of conventional models, adding robustness to schedules might
reduce the cost of disruptions and hopefully lead to reductions in operating cost.
Most existing trajectory planning models consider finding conflict-free
trajectories without taking the uncertainty of trajectory prediction into account. It is
shown in the thesis that, in the case of traffic disturbances, it is better to have a
robust solution; otherwise, newly generated congestion problems would be hard
and costly to solve.
This thesis introduces a novel approach for route generation (3D
trajectory) based on the homotopic features of continuous functions. It is shown that
this approach is capable of generating a large number of route shapes with a
reasonable number of decision variables. Those shapes are then coupled with
time dimension in order to create trajectories (4D)...
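The homotopy idea for route generation can be sketched as follows: deform a straight reference route with continuous perturbations that vanish at both endpoints, so each setting of a few decision variables yields a distinct route shape with the same origin and destination. The sinusoidal basis below is an illustrative choice, not the thesis's exact construction.

```python
import math

def route(s, alpha, beta):
    """2-D route from (0, 0) to (1, 0), parametrised by s in [0, 1], as a
    homotopic deformation of the straight line. The sinusoidal
    perturbations vanish at both endpoints, so every (alpha, beta)
    yields a path with the same origin and destination."""
    x = s
    y = alpha * math.sin(math.pi * s) + beta * math.sin(2.0 * math.pi * s)
    return (x, y)

# A small family of route shapes driven by just two decision variables.
shapes = [[route(i / 50.0, a, b) for i in range(51)]
          for a in (-0.2, 0.0, 0.2) for b in (-0.1, 0.1)]
```

Adding more basis terms enlarges the family of shapes while keeping the number of decision variables modest, which is the property the thesis exploits before coupling the shapes with a time dimension.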
Instantiated mixed effects modeling of Alzheimer's disease markers
The assessment and prediction of a subject's current and future risk of developing neurodegenerative diseases like Alzheimer's disease are of great interest both in the design of clinical trials and in clinical decision making. Exploring the longitudinal trajectory of markers related to neurodegeneration is an important task when selecting subjects for treatment in trials and the clinic, in the evaluation of early disease indicators, and in the monitoring of disease progression. Given that there is substantial intersubject variability, models that attempt to describe marker trajectories for a whole population will likely lack specificity for the representation of individual patients. Therefore, we argue here that individualized models provide a more accurate alternative that can be used for tasks such as population stratification and subject-specific prognosis. In the work presented here, mixed effects modeling is used to derive global and individual marker trajectories for a training population. Test-subject (new patient) specific models are then instantiated using a stratified "marker signature" that defines a subpopulation of similar cases within the training database. From this subpopulation, personalized models of the expected trajectory of several markers are subsequently estimated for unseen patients. These patient-specific marker models are shown to provide better predictions of time-to-conversion to Alzheimer's disease than population-based models.
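The core move, a population-level trend instantiated into a subject-specific trajectory, can be shown with a minimal stand-in (this is a simplification, not the paper's mixed effects pipeline or marker-signature stratification): estimate a shared slope from pooled training data, then shift the trajectory through a new patient's baseline observation.

```python
import random

random.seed(3)

# Synthetic marker trajectories (hypothetical data): each subject follows
# y = intercept_i + slope * t + noise, with a subject-specific intercept.
slope_true = 0.8
subjects = []
for _ in range(40):
    b = random.gauss(5.0, 1.0)
    subjects.append([(t, b + slope_true * t + random.gauss(0.0, 0.1))
                     for t in (0.0, 1.0, 2.0, 3.0)])

# "Fixed effect": pooled least-squares slope over all observations.
pts = [p for subj in subjects for p in subj]
n = len(pts)
mt = sum(t for t, _ in pts) / n
my = sum(y for _, y in pts) / n
slope = (sum((t - mt) * (y - my) for t, y in pts)
         / sum((t - mt) ** 2 for t, _ in pts))

# Instantiate a model for a new patient from one baseline observation:
# keep the population slope, shift the trajectory through the observation.
def personalize(t0, y0):
    return lambda t: (y0 - slope * t0) + slope * t

model = personalize(0.0, 6.2)
```

The paper goes further by restricting the training pool to similar cases before estimating the individual trajectory; here the whole population plays that role.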
Multivariate Models and Algorithms for Systems Biology
Rapid advances in high-throughput data acquisition technologies, such as microarrays and next-generation sequencing, have enabled scientists to interrogate the expression levels of tens of thousands of genes simultaneously. However, challenges remain in developing effective computational methods for analyzing data generated from such platforms. In this dissertation, we address some of these challenges. We divide our work into two parts. In the first part, we present a suite of multivariate approaches for a reliable discovery of gene clusters, often interpreted as pathway components, from molecular profiling data with replicated measurements. We translate our goal into learning an optimal correlation structure from replicated complete and incomplete measurements. In the second part, we focus on the reconstruction of signal transduction mechanisms in the signaling pathway components. We propose gene set based approaches for inferring the structure of a signaling pathway. First, we present a constrained multivariate Gaussian model, referred to as the informed-case model, for estimating the correlation structure from replicated and complete molecular profiling data. The informed-case model generalizes the previously known blind-case model by accommodating prior knowledge of replication mechanisms. Second, we generalize the blind-case model by designing a two-component mixture model. Our idea is to strike an optimal balance between a fully constrained correlation structure and an unconstrained one. Third, we develop an Expectation-Maximization algorithm to infer the underlying correlation structure from replicated molecular profiling data with missing (incomplete) measurements. We utilize our correlation estimators for clustering real-world replicated complete and incomplete molecular profiling data sets. The above three components constitute the first part of the dissertation.
For the structural inference of signaling pathways, we hypothesize a directed signaling pathway structure as an ensemble of overlapping and linear signal transduction events. We then propose two algorithms to reverse engineer the underlying signaling pathway structure using unordered gene sets corresponding to signal transduction events. Throughout, we treat gene sets as variables and the associated gene orderings as random. The first algorithm has been developed under the Gibbs sampling framework, and the second algorithm utilizes the framework of simulated annealing. Finally, we summarize our findings and discuss possible future directions.
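The simulated annealing variant of this ordering problem can be sketched on a toy instance (an illustration, not the dissertation's algorithm or scoring model): treat each unordered gene set as evidence that its genes should be contiguous in the linear pathway, score a candidate ordering by total non-contiguity, and anneal over permutations with swap moves.

```python
import math
import random

def penalty(order, gene_sets):
    """Total non-contiguity of the unordered gene sets under `order`."""
    pos = {g: i for i, g in enumerate(order)}
    total = 0
    for s in gene_sets:
        idx = [pos[g] for g in s]
        total += (max(idx) - min(idx) + 1) - len(s)
    return total

def anneal_order(gene_sets, genes, rng, iters=5000, t0=2.0):
    """Simulated annealing over permutations with pairwise swap moves."""
    order = genes[:]
    rng.shuffle(order)
    f = penalty(order, gene_sets)
    best, fbest = order[:], f
    for i in range(iters):
        temp = t0 * (1.0 - i / iters) + 1e-9
        a, b = rng.randrange(len(order)), rng.randrange(len(order))
        order[a], order[b] = order[b], order[a]
        fc = penalty(order, gene_sets)
        if fc <= f or rng.random() < math.exp(-(fc - f) / temp):
            f = fc
            if f < fbest:
                best, fbest = order[:], f
        else:
            order[a], order[b] = order[b], order[a]   # undo the swap
    return best, fbest

genes = list(range(8))
# Unordered gene sets assumed to be windows of the unknown linear pathway
# (a toy stand-in for signal transduction events).
sets = [[i, i + 1, i + 2] for i in range(6)]
best, fbest = anneal_order(sets, genes, random.Random(11))
```

A Gibbs sampling version of the same search would instead resample positions from their conditional distributions, which is the dissertation's first algorithm.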