
    Adaptive Importance Sampling for Performance Evaluation and Parameter Optimization of Communication Systems

    We present new adaptive importance sampling techniques based on stochastic Newton recursions. Their applicability to the performance evaluation of communication systems is studied. Besides bit-error rate (BER) estimation, the techniques are used for system parameter optimization. Two system models that are analytically tractable are employed to demonstrate the validity of the techniques. As an application to situations that are analytically intractable and numerically intensive, the influence of crosstalk in a wavelength-division multiplexing (WDM) crossconnect is assessed. In order to consider a realistic system model, optimal setting of the detector thresholds is carried out while estimating error-rate performance. The resulting BER estimates indicate that the tolerable crosstalk levels are significantly higher than predicted in the literature. This finding has a strong impact on the design of WDM networks. Power penalties induced by the addition of channels can also be accurately predicted in short run-times.
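    The abstract gives no implementation detail, so the snippet below is only a rough illustration of the underlying idea rather than the paper's stochastic-Newton adaptation: it estimates the BER of antipodal signalling in Gaussian noise by variance-scaling importance sampling. The function name `ber_importance_sampling` and the hand-picked `bias_scale` parameter are assumptions for this sketch; an adaptive scheme would tune the bias automatically.

```python
import numpy as np

def ber_importance_sampling(snr_db, bias_scale=3.0, n=200_000, seed=0):
    """Monte Carlo BER estimate for antipodal (+/-1) signalling in AWGN using
    variance-scaling importance sampling: noise is drawn from a Gaussian with
    an inflated standard deviation, and each error event is reweighted by the
    likelihood ratio of the true to the biased noise density."""
    rng = np.random.default_rng(seed)
    sigma = 10 ** (-snr_db / 20)        # true noise std (unit-amplitude symbols, simplified SNR definition)
    sigma_b = bias_scale * sigma        # biased noise std (hand-picked here, adaptive in the paper)
    noise = rng.normal(0.0, sigma_b, n)
    errors = (1.0 + noise) < 0.0        # transmit +1, detector decides by sign(rx)
    # likelihood ratio p_true(noise) / p_biased(noise)
    w = (sigma_b / sigma) * np.exp(noise**2 * (1.0 / (2 * sigma_b**2) - 1.0 / (2 * sigma**2)))
    return float(np.mean(errors * w))

# At 12 dB the true error probability is roughly 3e-5, so plain Monte Carlo
# would need on the order of 10^7 samples for a comparable estimate.
print(ber_importance_sampling(snr_db=12.0))
```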

    Operational risk management and new computational needs in banks

    Basel II banking regulation introduces new needs for computational schemes. They involve both optimal stochastic control and large-scale simulation of decision processes for preventing low-frequency, high-loss-impact events. This paper first states the problem and presents its parameters. It then spells out the equations that represent rational risk-management behavior and link the variables together: Lévy processes are used to model operational risk losses where calibration from historical loss databases is possible; where it is not, qualitative variables such as the quality of the business environment and of internal controls can provide both cost-side and profit-side impacts. Other control variables include the business growth rate and the efficiency of risk mitigation. The economic value of a policy is maximized by solving the resulting Hamilton-Jacobi-Bellman (HJB) type equation. Computational complexity arises from embedded interactions between three levels: (i) programming a globally optimal dynamic expenditure budget in the Basel II context; (ii) arbitraging between the cost of risk-reduction policies (as measured by organizational qualitative scorecards and insurance buying) and the impact of the incurred losses themselves, which implies modeling the efficiency of the process through which forward-looking measures of threat minimization can actually reduce stochastic losses; and (iii) optimally allocating budget according to profitability across subsidiaries and business lines. The paper next reviews the different types of approaches that can be envisaged for deriving a sound budgetary policy for operational risk management from this HJB equation. It is argued that while this complex, high-dimensional problem can be solved with the usual simplifications (Galerkin approach, imposing Merton-form solutions, viscosity solutions, ad hoc utility functions that yield closed-form solutions, etc.), the main interest of the model lies in exploring the scenarios in an adaptive learning framework (MDP, partially observed MDP, Q-learning, neuro-dynamic programming, greedy algorithms, etc.). This makes more sense from a management point of view, and the solutions are more easily communicated to, and accepted by, operational-level staff in banks through the explicit scenarios that can be derived. This kind of approach combines different computational techniques such as POMDPs, stochastic control theory, and learning algorithms under uncertainty and incomplete information. The paper concludes by presenting the benefits of such a consistent computational approach to managing budgets, as opposed to an operational risk management policy made up of disconnected expenditures. Such consistency satisfies the qualifying criteria for banks applying for the Advanced Measurement Approach (AMA), which allows substantial savings in the regulatory capital charge under the Basel II Accord.
    Keywords: operational risk management, HJB equation, Lévy processes, budget optimization, capital allocation
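    As a purely illustrative sketch, not the paper's model, the snippet below collapses the trade-off in (ii) to a single period: annual operational losses follow a compound-Poisson process (a simple Lévy process) with lognormal severities, mitigation spending is assumed to lower the loss frequency exponentially, and the budget is chosen by brute-force Monte Carlo comparison. The function names and all parameter values (`lam0`, `k`, `mu`, `sigma`) are placeholders; the dynamic HJB/MDP structure of the paper is deliberately ignored.

```python
import numpy as np

rng = np.random.default_rng(1)

def annual_loss(budget, lam0=25.0, k=0.15, mu=11.0, sigma=1.8, n_sims=20_000):
    """One-period operational-loss model: compound-Poisson loss counts with
    lognormal severities; mitigation spending (in millions) lowers the loss
    frequency as lambda = lam0 * exp(-k * budget).  Returns the mean simulated
    annual loss in currency units."""
    lam = lam0 * np.exp(-k * budget)
    counts = rng.poisson(lam, n_sims)
    totals = np.array([rng.lognormal(mu, sigma, c).sum() for c in counts])
    return totals.mean()

def best_budget(budgets):
    """Pick the mitigation budget minimizing (budget + expected loss), a crude
    one-period stand-in for the paper's dynamic optimization."""
    cost = {b: b * 1e6 + annual_loss(b) for b in budgets}
    return min(cost, key=cost.get), cost

b_star, cost_table = best_budget(range(0, 21, 2))
print("least-cost mitigation budget (millions):", b_star)
```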

    Exploration of Reaction Pathways and Chemical Transformation Networks

    For the investigation of chemical reaction networks, the identification of all relevant intermediates and elementary reactions is mandatory. Many algorithmic approaches exist that perform such explorations efficiently and in an automated fashion. These approaches differ in their range of application, the degree of completeness of the exploration, and the amount of heuristics and human intervention required. Here, we describe and compare the different approaches based on these criteria. Future directions leveraging the strengths of chemical heuristics, human interaction, and physical rigor are discussed. Comment: 48 pages, 4 figures.
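    The reviewed methods differ widely in how candidate reactions are generated; as a generic, hypothetical illustration of the skeleton many of them share (iteratively applying reaction rules to a growing pool of species), here is a toy breadth-first exploration routine. `apply_rules` is a placeholder for whatever rule engine, heuristic, or electronic-structure screening actually proposes elementary steps.

```python
from collections import deque

def explore_network(seeds, apply_rules, max_depth=3):
    """Generic breadth-first exploration of a reaction network: starting from
    seed species, repeatedly apply elementary-reaction rules to discover new
    intermediates, up to a fixed depth.  `apply_rules(mol, known_species)`
    must yield (reactants, products) pairs and is assumed, not specified,
    by any particular method."""
    species = set(seeds)
    reactions = []                       # discovered (reactants, products) pairs
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        mol, depth = frontier.popleft()
        if depth >= max_depth:
            continue
        for reactants, products in apply_rules(mol, species):
            reactions.append((reactants, products))
            for p in products:
                if p not in species:     # only enqueue genuinely new species
                    species.add(p)
                    frontier.append((p, depth + 1))
    return species, reactions
```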

    Study of application of adaptive systems to the exploration of the solar system. Volume 1: Summary

    The field of artificial intelligence is examined to identify practical applications to unmanned spacecraft used to explore the solar system in the decade of the 1980s. If an unmanned spacecraft can be made to adjust or adapt to its environment, and to make decisions about what it measures and how it uses and reports the data, it can become a much more powerful tool for the science community in unlocking the secrets of the solar system. Within this definition of an adaptive spacecraft or system there is a broad range of variability: in terms of sophistication, an adaptive system can be extremely simple or as complex as a chess-playing machine that learns from its mistakes.

    Reliability analysis of discrete-state performance functions via adaptive sequential sampling with detection of failure surfaces

    The paper presents a new, efficient and robust method for rare-event probability estimation for computational models of an engineering product or process that return only categorical information, for example either success or failure. For such models, most of the methods designed for the estimation of failure probability, which use the numerical value of the outcome to compute gradients or to estimate the proximity to the failure surface, cannot be applied. Even if the performance function provides more than a binary output, the state of the system may be a non-smooth or even discontinuous function defined over the domain of continuous input variables. In these cases, classical gradient-based methods usually fail. We propose a simple yet efficient algorithm that performs a sequential adaptive selection of points from the input domain of the random variables to extend and refine a simple distance-based surrogate model. Two different tasks can be accomplished at any stage of the sequential sampling: (i) estimation of the failure probability, and (ii) selection of the best possible candidate for the subsequent model evaluation if further improvement is necessary. The proposed criterion for selecting the next point for model evaluation maximizes the expected probability classified by using the candidate, so a balance between global exploration and local exploitation is maintained automatically. The method can estimate the probabilities of multiple failure types. Moreover, when the numerical value of a model evaluation can be used to build a smooth surrogate, the algorithm can accommodate this information to increase the accuracy of the estimated probabilities. Lastly, we define a new, simple yet general geometrical measure of the global sensitivity of the rare-event probability to individual variables, obtained as a by-product of the proposed algorithm. Comment: manuscript CMAME-D-22-00532R1 (Computer Methods in Applied Mechanics and Engineering).
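    The surrogate and selection criterion are only summarized in the abstract, so the sketch below is a simplified, hypothetical stand-in rather than the authors' algorithm: a 1-nearest-neighbour (distance-based) classifier over the evaluated points estimates the failure probability from a large candidate pool, and each new evaluation goes to the most isolated candidate instead of the paper's expected-classified-probability optimum. The toy limit state in the usage line is an assumption for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_pf(model, dim, n_eval=60, n_pool=20_000):
    """Sequential sampling with a distance-based (1-nearest-neighbour) surrogate
    for a categorical performance function model(x) -> 0.0/1.0 (1 = failure).
    The failure probability is the fraction of a candidate pool classified as
    failure by the surrogate built from the evaluated points."""
    pool = rng.standard_normal((n_pool, dim))    # candidates drawn from the input distribution
    X = rng.standard_normal((4, dim))            # small initial design
    y = np.array([model(x) for x in X])
    for _ in range(n_eval - len(X)):
        d = np.linalg.norm(pool[:, None, :] - X[None, :, :], axis=2)
        new = pool[d.min(axis=1).argmax()]       # simplified rule: evaluate the most isolated candidate
        X = np.vstack([X, new])
        y = np.append(y, model(new))
    d = np.linalg.norm(pool[:, None, :] - X[None, :, :], axis=2)
    return y[d.argmin(axis=1)].mean()            # pool fraction classified as failure

# Toy binary model: failure when the sum of two standard-normal inputs exceeds 3.5
# (true probability is roughly 0.7%); the sketch gives only a rough estimate.
print(estimate_pf(lambda x: float(x.sum() > 3.5), dim=2))
```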

    Introgressive Hybridization and the Evolution of Lake-Adapted Catostomid Fishes.

    Hybridization has been identified as a significant factor in the evolution of plants, as groups of interbreeding species retain their phenotypic integrity despite gene exchange among forms. Recent studies have identified similar interactions in animals; however, the role of hybridization in the evolution of animals has been contested. Here we examine patterns of gene flow among four species of catostomid fishes from the Klamath and Rogue rivers using molecular and morphological traits. Catostomus rimiculus from the Rogue and Klamath basins represents a monophyletic group for nuclear and morphological traits; however, the Klamath form shares mtDNA lineages with other Klamath Basin species (C. snyderi, Chasmistes brevirostris, Deltistes luxatus). Within other Klamath Basin taxa, D. luxatus was largely fixed for alternate nuclear alleles relative to C. rimiculus, while Ch. brevirostris and C. snyderi exhibited a mixture of these alleles. Deltistes luxatus was the only Klamath Basin species that exhibited consistent covariation of nuclear and mitochondrial traits and was the primary source of mismatched mtDNA in Ch. brevirostris and C. snyderi, suggesting asymmetrical introgression into the latter species. In Upper Klamath Lake, D. luxatus spawning was more likely to overlap spatially and temporally with C. snyderi and Ch. brevirostris than either of those two with each other. The latter two species could not be distinguished with any molecular markers but were morphologically diagnosable in Upper Klamath Lake, where they were largely spatially and temporally segregated during spawning. We examine parallel evolution and syngameon hypotheses and conclude that the observed patterns are most easily explained by introgressive hybridization among Klamath Basin catostomids.