
    Dynamic probabilistic constraints under continuous random distributions

    The paper investigates analytical properties of dynamic probabilistic constraints (chance constraints). The underlying random distribution is assumed to be continuous. In the first part, a general multistage model with decision rules depending on past observations of the random process is analyzed. Basic properties such as (weak sequential) (semi-)continuity of the probability function and existence of solutions are studied. It turns out that the results differ significantly according to whether decision rules are embedded into Lebesgue or Sobolev spaces. In the second part, the simplest meaningful two-stage model with decision rules from L^2 is investigated. More specific properties such as Lipschitz continuity and differentiability of the probability function are considered. Explicitly verifiable conditions for these properties are provided, along with explicit gradient formulae in the Gaussian case. The application of such formulae in the context of necessary optimality conditions is discussed, and a concrete identification of solutions is presented.
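
    As a rough illustration of the kind of probability function studied in such models, the sketch below estimates p(x) = P(g(x, ξ) ≤ 0) for a Gaussian random vector ξ by plain Monte Carlo and approximates its gradient by central finite differences. The linear constraint map, its dimensions and the sample size are illustrative assumptions; the paper's explicit gradient formulae for the Gaussian case are not reproduced here.

```python
import numpy as np

# Illustrative linear constraint g(x, xi) = A x + B xi - b <= 0 (componentwise).
# A, B, b are made-up placeholders, not taken from the paper.
A = np.array([[1.0, 0.5], [0.2, 1.0]])
B = np.array([[1.0, 0.0], [0.3, 1.0]])
b = np.array([2.0, 2.0])

def probability(x, n_samples=200_000, seed=0):
    """Monte Carlo estimate of p(x) = P(g(x, xi) <= 0) with xi ~ N(0, I).

    A fixed seed gives common random numbers across calls, which keeps the
    finite-difference gradient estimate below reasonably stable."""
    xi = np.random.default_rng(seed).standard_normal((n_samples, B.shape[1]))
    g = A @ x + xi @ B.T - b                      # shape (n_samples, 2)
    return np.mean(np.all(g <= 0.0, axis=1))

def probability_gradient(x, h=1e-2):
    """Central finite-difference approximation of the gradient of p at x."""
    grad = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        grad[i] = (probability(x + e) - probability(x - e)) / (2.0 * h)
    return grad

x0 = np.array([0.5, 0.5])
print("p(x0)      ~", probability(x0))
print("grad p(x0) ~", probability_gradient(x0))
```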

    Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.


    Optimization of marine sediments characterization via statistical analysis

    The task of geotechnical site characterization includes defining the layout of ground units and establishing their relevant engineering properties. This is an activity in which uncertainties of different natures (inherent, experimental, of interpretation) are always present and in which the amount and characteristics of available data are highly variable. Probabilistic methodologies are applied to assess and manage these uncertainties. A Bayesian perspective of probability, which roots probability in belief, is well suited to geotechnical characterization problems, as it has the flexibility to handle different kinds of uncertainties and datasets that are highly variable in quality and quantity. This thesis addresses different topics of geotechnical site characterization from a probabilistic perspective, with emphasis on offshore investigation, on the cone penetration test (CPTu) and on Bayesian methodologies.

    The first topic is soil layer delineation based on CPT(u) data. The starting point is the recognition that layer delineation is problem-oriented and has a strong subjective component. We propose a novel CPTu record analysis methodology that aims to: a) elicit the heuristics that intervene in layer delineation, facilitating communication and coherence in interpretation; b) facilitate probabilistic characterization of the identified layers; and c) be simple and intuitive to use. The method is based on sequential distribution fitting in conventionally accepted classification spaces (Soil Behavior Type charts). The proposed technique is applied at different sites, illustrating how it can be related to borehole observations, how it compares with alternative methodologies and how it can be extended to create cross-site profiles.

    The second topic is strain-rate correction of dynamic CPTu data. Dynamic CPTu devices impact the seafloor and are very agile characterization tools; however, their records require transformation to equivalent quasi-static results that can be interpreted conventionally. Up to now, the necessary corrections have been either too vague or have required the acquisition of paired dynamic and quasi-static CPTu records (i.e., acquired at the same location). A Bayesian methodology is applied to derive strain-rate coefficients in a more general setting, in which some quasi-static CPTu records are available in the study area but need not be paired with any converted dynamic CPTu. Application to a case study offshore Nice shows that the results match those obtained using paired tests. Furthermore, strain-rate correction coefficients and transformed quasi-static profiles are expressed in probabilistic terms.

    The third topic is the optimization of soil unit weight prediction from CPTu readings. A Bayesian mixture analysis is applied to a global database to identify hidden soil classes within it. The goal is to improve the accuracy of regressions between geotechnical parameters obtained by exploiting the database. The method is applied to predict soil unit weight from CPTu data, a problem that has intrinsic practical interest but is also representative of the difficulties faced by a larger class of geotechnical regression problems. Results highlight a decrease in systematic transformation uncertainty and an improvement in the accuracy of soil unit weight prediction from CPTu at new sites.

    In a final application, we present a probabilistic earthquake-induced landslide susceptibility map of the South-West Iberian margin. A simplified geotechnical pixel-based slope stability model is applied to the whole study area, within which the key stability model parameters are treated as random variables. Site characterization at the regional scale combines a global database with available geotechnical data through a Bayesian scheme. The outputs (landslide susceptibility maps) are derived from a reliability-based design procedure (Monte Carlo simulations), providing a robust landslide susceptibility prediction at the site according to the Receiver Operating Characteristic (ROC) curve.
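
    For the final landslide-susceptibility application, the abstract mentions a reliability-based procedure (Monte Carlo simulation) over a simplified slope stability model whose key parameters are treated as random variables. The sketch below illustrates that general idea with a textbook infinite-slope factor of safety and made-up parameter distributions; the actual stability model, parameter distributions and seismic loading used in the thesis are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)

def factor_of_safety(c, phi_deg, gamma, z=3.0, beta_deg=35.0):
    """Textbook infinite-slope factor of safety (dry, static case).

    c        effective cohesion [kPa]
    phi_deg  effective friction angle [deg]
    gamma    unit weight [kN/m^3]
    z        failure-surface depth [m]   (assumed value)
    beta_deg slope angle [deg]           (assumed value)
    """
    beta, phi = np.radians(beta_deg), np.radians(phi_deg)
    resisting = c + gamma * z * np.cos(beta) ** 2 * np.tan(phi)
    driving = gamma * z * np.sin(beta) * np.cos(beta)
    return resisting / driving

# Key stability parameters treated as random variables (illustrative distributions only).
n = 200_000
c = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=n)   # kPa
phi = rng.normal(loc=28.0, scale=3.0, size=n)            # deg
gamma = rng.normal(loc=18.0, scale=1.0, size=n)          # kN/m^3

fs = factor_of_safety(c, phi, gamma)
print(f"Monte Carlo probability of failure: {np.mean(fs < 1.0):.3f}")
```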

    EUROPEAN CONFERENCE ON QUEUEING THEORY 2016

    This booklet contains the proceedings of the second European Conference on Queueing Theory (ECQT), held from the 18th to the 20th of July 2016 at the engineering school ENSEEIHT, Toulouse, France. ECQT is a biennial event where scientists and practitioners in queueing theory and related areas get together to promote research, encourage interaction and exchange ideas. The spirit of the conference is to be a queueing event organized from within Europe, but open to participants from all over the world. The technical program of the 2016 edition consisted of 112 presentations organized in 29 sessions covering all trends in queueing theory, including the development of the theory, methodology advances, computational aspects and applications. Another exciting feature of ECQT 2016 was the institution of the Takács Award for an outstanding PhD thesis on "Queueing Theory and its Applications".

    The History of the Quantitative Methods in Finance Conference Series. 1992-2007

    This report charts the history of the Quantitative Methods in Finance (QMF) conference from its beginning in 1993 to the 15th conference in 2007. It lists alphabetically the 1037 speakers who presented at all 15 conferences and the titles of their papers.

    Predictive Maneuver Planning and Control of an Autonomous Vehicle in Multi-Vehicle Traffic with Observation Uncertainty

    Autonomous vehicle technology is a promising development for improving the safety, efficiency and environmental impact of on-road transportation systems. However, the task of guiding an autonomous vehicle by rapidly and systematically accommodating a plethora of changing constraints (e.g. avoiding multiple stationary and moving obstacles, obeying traffic rules and signals, and so on), as well as uncertain state observations due to sensor imperfections, remains a major challenge. This dissertation attempts to address this challenge by designing a robust and efficient predictive motion planning framework that can generate appropriate vehicle maneuvers (selecting and tracking specific lanes, and related speed references) as well as the constituent motion trajectories, while considering the differential vehicle kinematics of the controlled vehicle and other constraints of operating in public traffic. The main framework combines a finite state machine (FSM)-based maneuver decision module with a model predictive control (MPC)-based trajectory planner. Based on the prediction of the traffic environment, reference speeds are assigned to each lane in accordance with the detection of objects during the measurement update. The lane selection decisions themselves are then incorporated within the MPC optimization.

    The on-line maneuver/motion planning effort for autonomous vehicles in public traffic is a non-convex problem due to the multiple collision avoidance constraints with overlapping areas, lane boundaries, and nonlinear vehicle-road dynamics constraints. This dissertation proposes and derives remedies for these challenges within the planning framework to improve the feasibility and optimality of the solution. Specifically, it introduces vehicle grouping notions and derives conservative and smooth algebraic models to describe the overlapped space of several individually infeasible spaces, helping prevent the optimization from falling into undesired local minima. In certain situations, a forced objective selection strategy is also adopted to help the optimization escape local minima.

    Furthermore, the dissertation considers the stochastic uncertainties prevalent in dynamic and complex traffic and incorporates them within the predictive planning and control framework. To this end, Bayesian filters are implemented to estimate the uncertainties in object motions and then propagate them over the prediction horizon. A pair-wise probabilistic collision condition is then defined for objects with non-negligible geometrical shapes/sizes, and computationally efficient, conservative forms are derived to analytically approximate the multi-variate integrals involved. The probabilistic collision evaluation is applied within a vehicle grouping algorithm to cluster object vehicles that are close in position and speed, and eventually within the stochastic predictive maneuver planning framework to tighten the chance constraints given a deterministic confidence margin. It is argued that these steps make the planning problem tractable for real-time implementation on autonomously controlled vehicles.
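
    The pair-wise probabilistic collision condition and its conservative analytic approximation are only described qualitatively in the abstract. The sketch below shows one common conservative bound under assumed 2-D Gaussian position uncertainty: the probability that two disk-shaped objects overlap is bounded by the area of the combined collision disk times the maximum Gaussian density of the relative position over that disk. The radii, covariances and example numbers are illustrative assumptions, not the dissertation's derivation.

```python
import numpy as np

def collision_probability_bound(mu_ego, cov_ego, mu_obj, cov_obj, r_ego, r_obj):
    """Conservative upper bound on P(||p_obj - p_ego|| <= r_ego + r_obj) for
    independent 2-D Gaussian positions: (disk area) x (max density over disk).
    A common conservative approximation, not necessarily the dissertation's form."""
    mu = np.asarray(mu_obj, float) - np.asarray(mu_ego, float)      # relative mean
    cov = np.asarray(cov_ego, float) + np.asarray(cov_obj, float)   # relative covariance
    r = r_ego + r_obj                                               # combined radius

    # Lower bound on the Mahalanobis distance from the relative mean to the
    # collision disk, via the largest eigenvalue of the covariance.
    lam_max = np.max(np.linalg.eigvalsh(cov))
    m_min = max(np.linalg.norm(mu) - r, 0.0) / np.sqrt(lam_max)

    peak_density = np.exp(-0.5 * m_min**2) / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))
    return min(np.pi * r**2 * peak_density, 1.0)

# Illustrative numbers only: object vehicle ~10 m ahead, moderate position uncertainty.
p = collision_probability_bound(
    mu_ego=[0.0, 0.0], cov_ego=np.diag([0.3, 0.3]),
    mu_obj=[10.0, 2.0], cov_obj=np.diag([0.8, 0.5]),
    r_ego=2.0, r_obj=2.0,
)
print(f"conservative collision probability bound: {p:.2e}")
```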

    Semantic array programming in data-poor environments: assessing the interactions of shallow landslides and soil erosion

    This research was conducted with the main objective of better integrating and quantifying the role of water-induced shallow landslides within soil erosion processes, with a particular focus on data-poor conditions. To fulfil this objective, catchment-scale studies on soil erosion by water and shallow landslides were conducted. A semi-quantitative method that combines heuristic, deterministic and probabilistic approaches is proposed for a robust catchment-scale assessment of landslide susceptibility when available data are scarce. A set of different susceptibility zonation maps was aggregated by means of a modelling ensemble. Each susceptibility zonation was obtained by applying heterogeneous statistical techniques such as logistic regression (LR), relative distance similarity (RDS), artificial neural networks (ANN), and two different landslide-susceptibility techniques based on the infinite slope stability model. The good performance of the ensemble model compared with the single techniques makes the method suitable for application in data-poor areas, where the lack of proper calibration and validation data can hinder the application of physically based or conceptual models.

    A new modelling architecture to support the integrated assessment of soil erosion, incorporating rainfall-induced shallow landslide processes in data-poor conditions, was developed and tested in the study area. The proposed methodology is based on the geospatial semantic array programming paradigm. The integrated data-transformation model relies on a modular architecture in which the information flow among modules is constrained by semantic checks. Analysis of the modelling results within the study catchment shows that, each year, mass movements are responsible on average for an increase in the total soil erosion rate of between 22% and 26% over the pre-failure estimate. The post-failure soil erosion rate in areas where landslides occurred is, on average, around 3.5 times the pre-failure value. These results confirm the importance of integrating the landslide contribution into soil erosion modelling. Because the estimation of the changes in soil erosion from landslide activity is largely dependent on the quality of available datasets, this methodology broadens the possibility of a quantitative assessment of these effects in data-poor regions.
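
    The abstract does not spell out the aggregation rule or the semantic checks themselves. As a minimal sketch of the idea, the code below validates each module's susceptibility map against lightweight semantic constraints (grid shape, value range) before forming a simple weighted average; the model names, weights, aggregation rule and grid are hypothetical placeholders.

```python
import numpy as np

def check_susceptibility(arr, shape):
    """Semantic pre/post-condition: a susceptibility map must match the
    study-area grid and contain finite values in [0, 1]."""
    arr = np.asarray(arr, dtype=float)
    assert arr.shape == shape, "map does not match the study-area grid"
    assert np.all(np.isfinite(arr)), "map contains non-finite values"
    assert np.all((arr >= 0.0) & (arr <= 1.0)), "values outside [0, 1]"
    return arr

def ensemble_susceptibility(maps, weights):
    """Weighted ensemble of susceptibility maps, with semantic checks
    constraining the information flow between modules."""
    shape = np.asarray(next(iter(maps.values()))).shape
    stack = np.stack([check_susceptibility(m, shape) for m in maps.values()])
    w = np.asarray([weights[name] for name in maps], dtype=float)
    w /= w.sum()
    out = np.tensordot(w, stack, axes=1)          # weighted average per pixel
    return check_susceptibility(out, shape)       # post-condition on the output

# Hypothetical 3x3 study grid and per-model outputs (LR, RDS, ANN, ...).
grid = (3, 3)
maps = {
    "LR":  np.full(grid, 0.40),
    "RDS": np.full(grid, 0.55),
    "ANN": np.full(grid, 0.35),
}
weights = {"LR": 1.0, "RDS": 1.0, "ANN": 1.0}
print(ensemble_susceptibility(maps, weights))
```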