
    Multi-Scale Methodologies for Probabilistic Resilience Assessment and Enhancement of Bridges and Transportation Systems

    When an extreme event such as an earthquake or a tsunami occurs, the socioeconomic losses due to the reduced functionality of infrastructure systems over time are comparable to, or even higher than, the immediate losses caused by the event itself. Therefore, one of the highest priorities of owners, disaster management officials, and decision makers in general is to predict the disaster performance of lifelines and infrastructure a priori under different scenarios, and to be able to restore functionality efficiently to the normal condition, or at least to an acceptable level during the emergency, in the aftermath of a catastrophe. In line with this need, academic research has focused on the concept of infrastructure resilience, which reflects the ability of structures, infrastructure systems, and communities to both withstand an extreme event and quickly recover functionality afterwards. Among infrastructure systems, transportation networks are of utmost importance, as they allow people to move from damaged to safe areas and rescue/recovery teams to accomplish their mission effectively. Moreover, the functionality and restoration of several other infrastructure systems and socio-economic units of the community are highly interdependent with transportation network performance. Among the components of transportation networks, bridges are among the most vulnerable and need particular attention. In this respect, this research focuses on the quantification and optimization of the functionality and resilience of bridges and transportation networks in the aftermath of extreme events, in particular earthquakes, considering the underlying uncertainties. The scope of the study includes: (i) accurate and efficient assessment of the seismic fragility of individual bridges; (ii) development of a technique for assessing bridge functionality and its probabilistic characteristics following an earthquake and during the restoration process; (iii) development of efficient optimization techniques for post-event restoration and pre-event retrofit prioritization of bridges; and (iv) development of metrics and formulations for realistic quantification of the functionality and resilience of bridges and transportation networks.

    The evaluation of the damage and its probabilistic characteristics is the first step towards the assessment of the functionality of a bridge. In this regard, a simulation-based methodology was introduced for probabilistic seismic demand and fragility analyses, aimed at improving the accuracy of the resilience and life-cycle loss assessment of highway bridges. The impact of different assumptions made about the demand was assessed to determine whether they are acceptable. The results show that, among these assumptions, the power model and the constant-dispersion assumption introduce a considerable amount of error into the estimated probabilistic characteristics of demand and fragility. This error can be avoided using the introduced simulation-based technique, which takes advantage of the computational resources widely available nowadays.

    A new framework was presented to estimate probabilistic restoration functions of damaged bridges. This was accomplished by simulating different restoration project scenarios, considering the construction methods common in practice and the amount of available resources. Moreover, two scheduling schemes were proposed to handle the uncertainties in project scheduling and planning. The application of the proposed methodology was presented for the case of a bridge under a seismic scenario. The results show the critical impact of temporary repair solutions (e.g., temporary shoring) on the probabilistic characteristics of the functionality of the bridge during the restoration; the consideration of such solutions in probabilistic functionality and resilience analyses of bridges is therefore necessary. A considerable degree of nonlinearity was also recognized in the relationship among restoration resource availability, restoration duration, and the bridge functionality level during the restoration process.

    A new tool called the "Functionality-Fragility Surface" (FFS) was introduced for pre-event probabilistic recovery and resilience prediction of damaged structures, infrastructure systems, and communities. The FFS combines fragility and restoration functions and gives the probability of suffering a certain functionality loss at a given time elapsed since the occurrence of the extreme event, conditioned on the intensity of the event. FFSs were developed for an archetype bridge to showcase the application of the proposed tool and formulation. Regarding network-level analysis, a novel evolutionary optimization methodology for scheduling independent tasks under resource and time constraints was proposed. The application of the proposed methodology to multi-phase resilience-optimal restoration of highway bridges was presented and discussed. The results show the superior performance of the presented technique compared with other formulations, both in terms of convergence rate and optimality of the solution; the computed resilience-optimal restoration schedules are also more practical and easier to interpret. Moreover, new connectivity-based metrics were introduced to measure the functionality and resilience of transportation networks, taking into account the priorities typically considered during the medium-term phase of disaster management.

    A two-level simulation-based optimization framework for bridge retrofit prioritization is presented. The objectives of the upper-level optimization are the minimization of the cost of the bridge retrofit strategy and of the probability of resilience failure, defined as the probability that the post-event optimal resilience is less than a critical value. The combined effect of the uncertainties in the seismic event characteristics and in the resulting damage states of the bridges is taken into account by using an advanced, efficient sampling technique together with fragility analysis. The proposed methodology was applied to a transportation network, and different optimal bridge retrofit strategies were computed. The technique proved to be effective and efficient in computing the optimal bridge retrofit solutions for the example transportation network.
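
    As a rough illustration of the kind of simulation-based fragility estimation described above, the sketch below fits a lognormal fragility curve to Monte Carlo demand samples. The demand model, capacity threshold, and all parameter values are placeholders for illustration, not those used in the thesis.

```python
# Minimal sketch of simulation-based fragility estimation (illustrative only;
# the demand model, capacity value, and parameters below are assumptions,
# not those used in the thesis).
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

im_levels = np.linspace(0.1, 2.0, 20)   # intensity measure, e.g. PGA in g
capacity = 0.02                          # hypothetical drift capacity
n_sims = 2000                            # simulated analyses per IM level

def simulate_demand(im, n):
    """Hypothetical nonlinear demand model with IM-dependent dispersion."""
    median = 0.01 * im ** 1.3            # nonlinear median demand
    beta = 0.3 + 0.1 * im                # dispersion grows with intensity
    return rng.lognormal(mean=np.log(median), sigma=beta, size=n)

# Empirical exceedance probability at each IM level
p_fail = np.array([np.mean(simulate_demand(im, n_sims) > capacity)
                   for im in im_levels])

# Fit a lognormal fragility curve P(failure | IM) = Phi(ln(IM/theta)/beta)
def fragility(im, theta, beta):
    return stats.norm.cdf(np.log(im / theta) / beta)

(theta_hat, beta_hat), _ = curve_fit(fragility, im_levels, p_fail,
                                     p0=(1.0, 0.5), bounds=(1e-3, 10))
print(f"median capacity IM ~ {theta_hat:.2f} g, dispersion ~ {beta_hat:.2f}")
```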

    Quantum Natural Language Generation on Near-Term Devices

    The emergence of noisy medium-scale quantum devices has led to proof-of-concept applications for quantum computing in various domains. Examples include Natural Language Processing (NLP), where sentence classification experiments have been carried out, as well as procedural generation, where tasks such as geopolitical map creation and image manipulation have been performed. We explore applications at the intersection of these two areas by designing a hybrid quantum-classical algorithm for sentence generation. Our algorithm is based on the well-known simulated annealing technique for combinatorial optimisation. An implementation is provided and used to demonstrate successful sentence generation on both simulated and real quantum hardware. A variant of our algorithm can also be used for music generation. This paper aims to be self-contained, introducing all the necessary background on NLP and quantum computing along the way. Comment: To appear in proceedings of International Natural Language Generation Conference (INLG) 202
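
    For readers unfamiliar with simulated annealing, the purely classical sketch below shows the accept/reject loop that this kind of algorithm builds on, applied to a toy word-selection problem. The vocabulary and cost function are invented for illustration; in the hybrid algorithm the score would come from a quantum circuit rather than this toy heuristic.

```python
# Purely classical sketch of a simulated-annealing loop (illustrative only;
# the toy vocabulary and cost function are assumptions, not the paper's
# quantum cost evaluation).
import math
import random

random.seed(0)

vocabulary = ["quantum", "devices", "generate", "sentences", "noisy", "today"]
target_length = 4

def cost(sentence):
    """Toy cost: penalise repeated words and deviation from the target length."""
    return abs(len(sentence) - target_length) + (len(sentence) - len(set(sentence)))

def random_neighbour(sentence):
    """Propose a small random edit: add, remove, or replace one word."""
    s = list(sentence)
    move = random.choice(["add", "remove", "replace"])
    if move == "add" or not s:
        s.insert(random.randrange(len(s) + 1), random.choice(vocabulary))
    elif move == "remove":
        s.pop(random.randrange(len(s)))
    else:
        s[random.randrange(len(s))] = random.choice(vocabulary)
    return s

state = [random.choice(vocabulary)]
temperature = 2.0
for step in range(2000):
    candidate = random_neighbour(state)
    delta = cost(candidate) - cost(state)
    # Accept improvements always, and worse moves with Boltzmann probability
    if delta <= 0 or random.random() < math.exp(-delta / temperature):
        state = candidate
    temperature *= 0.995                 # geometric cooling schedule

print("final sentence:", " ".join(state), "| cost:", cost(state))
```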

    Creating a useful vascular center: a statewide survey of what primary care physicians really want

    Objective: Multidisciplinary vascular centers (VCs) have been proposed to integrate vascular patient care. No studies, however, have assessed referring physician interest or which services should be provided. A statewide survey of primary care physicians (PCPs) was performed to answer these questions. Methods: Questionnaires were mailed to 3711 PCPs, asking about familiarity with vascular disease, potential VC usage, and services VCs should provide. Univariate and multivariate analyses were used to determine which PCPs would refer patients, the services desired, and which patients would be referred. Results: Of 1006 PCPs who responded, 66% would refer patients to a VC, especially patients younger than 50 years (P < .001) and those with lower extremity disease (P < .001) or abdominal aortic aneurysm (P < .001). PCPs practicing within 50 miles of a VC (P < .001), those in practice less than 5 years (P < .001), and those without specific training in vascular disease during residency (P = .004) were most likely to refer patients. Vascular surgery (97%), interventional radiology (90%), and a noninvasive vascular laboratory (82%) were considered the most important services, and physician educational services (62%) were also desirable. PCPs did not think cardiology, cardiac surgery, smoking cessation programs, or diabetes or lipid management were needed. Reasons for VC nonuse included travel distance (23%), sufficient local services (21%), and insurance issues (12%). Only 16% of PCPs believe that their patients with vascular disease currently receive optimal care. Conclusion: There is considerable interest in VCs among PCPs. In contrast to recently described models, VCs need not incorporate cardiology, cardiac surgery, smoking cessation programs, or diabetes or lipid management. VCs should include vascular surgery, interventional radiology, a noninvasive vascular laboratory, and physician educational services.

    Haematological and infectious complications associated with the treatment of patients with congenital cardiac disease: consensus definitions from the Multi-Societal Database Committee for Pediatric and Congenital Heart Disease

    A complication is an event or occurrence that is associated with a disease or a healthcare intervention, is a departure from the desired course of events, and may cause, or be associated with, a suboptimal outcome. A complication does not necessarily represent a breach in the standard of care that constitutes medical negligence or medical malpractice. An operative or procedural complication is any complication, regardless of cause, occurring (1) within 30 days after surgery or intervention, in or out of the hospital, or (2) after 30 days during the same hospitalization subsequent to the operation or intervention. Operative and procedural complications include both intraoperative/intraprocedural complications and postoperative/postprocedural complications in this time interval. The Multi-Societal Database Committee for Pediatric and Congenital Heart Disease has set forth a comprehensive list of complications associated with the treatment of patients with congenital cardiac disease, related to the cardiac, pulmonary, renal, haematological, infectious, neurological, gastrointestinal, and endocrinal systems, as well as those related to the management of anaesthesia and perfusion, and the transplantation of thoracic organs. The objective of this manuscript is to examine the definitions of operative morbidity as they relate specifically to the haematological system and to infectious complications. These specific definitions and terms will be used to track morbidity associated with surgical and transcatheter interventions and other forms of therapy in a common language across many separate databases. The Multi-Societal Database Committee for Pediatric and Congenital Heart Disease has prepared and defined a near-exhaustive list of haematological and infectious complications. Within each subgroup, complications are presented in alphabetical order. Clinicians caring for patients with congenital cardiac disease will be able to use this list for databases, quality improvement initiatives, reporting of complications, and comparing strategies for treatment.

    Comparison of the Effects of Er, Cr: YSGG Laser and Super-Saturated Citric Acid on the Debridement of Contaminated Implant Surfaces

    Introduction: Several techniques, such as the use of citric acid, plastic curettes, ultrasonic devices, and lasers, have been suggested for the debridement of contaminated implant surfaces. This comparative investigation aimed to assess and compare the effects of the Er, Cr: YSGG laser and super-saturated citric acid on the debridement of contaminated dental implant surfaces. Methods: In this in-vitro study, 12 contaminated failed implants were collected and randomly divided into 2 groups (6 in group A and 6 in group B); one additional implant served as the control. The implants were horizontally sectioned into coronal and apical portions and subsequently treated with the Er, Cr: YSGG laser on the coronal portion and citric acid on the apical portion in group A, and the opposite in group B. To evaluate the effect of water spray on the laser-treated section, half of the laser portion of each implant was irradiated with water spray, while the other half was irradiated without water, with an irradiation time of 1 minute. Results: Calculus and plaque removal was greater in the laser-treated parts of both groups (with and without water) than in the citric acid parts, and the correlation between calculus removal and surface roughness was statistically significant. Furthermore, the surface roughness in the citric acid parts was significantly higher than in the laser parts. Water spray during irradiation had a very small influence on the factors under study. Conclusion: Based on the results of this study, the Er, Cr: YSGG laser was more effective in calculus removal and caused less surface roughness compared with citric acid application.

    Analyzing the Performance of Variational Quantum Factoring on a Superconducting Quantum Processor

    In the near term, hybrid quantum-classical algorithms hold great potential for outperforming classical approaches. Understanding how these two computing paradigms work in tandem is critical for identifying areas where such hybrid algorithms could provide a quantum advantage. In this work, we study a QAOA-based quantum optimization algorithm by implementing the Variational Quantum Factoring (VQF) algorithm. We execute experimental demonstrations using a superconducting quantum processor and investigate the trade-off between quantum resources (number of qubits and circuit depth) and the probability that a given biprime is successfully factored. In our experiments, the integers 1099551473989, 3127, and 6557 are factored with 3, 4, and 5 qubits, respectively, using a QAOA ansatz with up to 8 layers, and we are able to identify the optimal number of circuit layers for a given instance to maximize the success probability. Furthermore, we demonstrate the impact of different noise sources on the performance of QAOA and reveal the coherent error caused by the residual ZZ-coupling between qubits as a dominant source of error in the superconducting quantum processor.
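
    To make the optimization view of factoring concrete, the sketch below evaluates the cost function that VQF-style approaches minimise, (N - p*q)^2 over candidate factor bit strings, and scans it exhaustively for a small biprime. The brute-force scan only illustrates the cost landscape; the actual algorithm applies classical preprocessing and explores the landscape variationally with a QAOA circuit, which is not reproduced here.

```python
# Classical sketch of the factoring cost landscape (illustrative only; the
# real VQF workflow preprocesses the problem and optimises it with a QAOA
# circuit, not by exhaustive search).
from itertools import product

def factoring_cost(N, p_bits, q_bits):
    """Cost (N - p*q)^2 for candidate factor bit strings (little-endian)."""
    p = sum(b << i for i, b in enumerate(p_bits))
    q = sum(b << i for i, b in enumerate(q_bits))
    return (N - p * q) ** 2

def brute_force_minimum(N, n_p, n_q):
    """Exhaustively scan the 2^(n_p + n_q) bit assignments and return the
    lowest-cost one; a variational ansatz would explore this space instead."""
    best = None
    for p_bits in product([0, 1], repeat=n_p):
        for q_bits in product([0, 1], repeat=n_q):
            c = factoring_cost(N, p_bits, q_bits)
            if best is None or c < best[0]:
                best = (c, p_bits, q_bits)
    return best

cost_min, p_bits, q_bits = brute_force_minimum(3127, n_p=6, n_q=6)
p = sum(b << i for i, b in enumerate(p_bits))
q = sum(b << i for i, b in enumerate(q_bits))
print(f"3127 = {p} x {q}, cost = {cost_min}")
```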

    Impact of ionizing radiation on superconducting qubit coherence

    The practical viability of any qubit technology stands on long coherence times and high-fidelity operations, with the superconducting qubit modality being a leading example. However, superconducting qubit coherence is impacted by broken Cooper pairs, referred to as quasiparticles, whose density is empirically observed to be orders of magnitude greater than the value predicted for thermal equilibrium by the Bardeen-Cooper-Schrieffer (BCS) theory of superconductivity. Previous work has shown that infrared photons significantly increase the quasiparticle density, yet even in the best isolated systems, it still remains higher than expected, suggesting that another generation mechanism exists. In this Letter, we provide evidence that ionizing radiation from environmental radioactive materials and cosmic rays contributes to this observed difference, leading to an elevated quasiparticle density that would ultimately limit superconducting qubits of the type measured here to coherence times in the millisecond regime. We further demonstrate that introducing radiation shielding reduces the flux of ionizing radiation and positively correlates with increased coherence time. Although the effect is small for today's qubits, reducing or otherwise mitigating the impact of ionizing radiation will be critical for realizing fault-tolerant superconducting quantum computers. Comment: 16 pages, 12 figures.

    Is there a relationship between surgical case volume and mortality in congenital heart disease services? A rapid evidence review.

    OBJECTIVE: To identify and synthesise the evidence on the relationship between surgical volume and patient outcomes for adults and children with congenital heart disease. DESIGN: Evidence synthesis of interventional and observational studies. DATA SOURCES: MEDLINE, EMBASE, CINAHL, Cochrane Library and Web of Science (2009-2014), together with citation searching, reference lists and recommendations from stakeholders (2003-2014), were used to identify evidence. STUDY SELECTION: Quantitative observational and interventional studies with information on the volume of surgical procedures and patient outcomes were included. RESULTS: 31 of the 34 papers identified (91.2%) included only paediatric patients. 25 (73.5%) investigated the relationship between volume and mortality, 7 (20.6%) investigated volume in relation to mortality and other outcomes, and 2 (5.9%) investigated non-mortality outcomes only. 88.2% were from the US, 97% were multicentre studies and all were retrospective observational studies. 20 studies (58.8%) included all congenital heart disease conditions and 14 (41.2%) single conditions or procedures. No UK studies were identified. Most studies showed a relationship between volume and outcome, but this relationship was not consistent. The relationship was stronger for single complex conditions or procedures. We found limited evidence about the impact of volume on non-mortality outcomes. A mixed picture emerged, revealing a range of factors in addition to volume that influence outcome, including condition severity, individual centre and surgeon effects, and clinical advances over time. CONCLUSIONS: The heterogeneity of findings from observational studies suggests that, while a relationship between volume and outcome exists, this is unlikely to be a simple, independent and directly causal relationship. The effect of volume on outcome relative to the effect of other, as yet undetermined, health system factors remains a complex and unresolved research question.

    Learning-based Calibration of Flux Crosstalk in Transmon Qubit Arrays

    Superconducting quantum processors comprising flux-tunable data and coupler qubits are a promising platform for quantum computation. However, magnetic flux crosstalk between the flux-control lines and the constituent qubits impedes precision control of qubit frequencies, presenting a challenge to scaling this platform. In order to implement high-fidelity digital and analog quantum operations, one must characterize the flux crosstalk and compensate for it. In this work, we introduce a learning-based calibration protocol and demonstrate its experimental performance by calibrating an array of 16 flux-tunable transmon qubits. To demonstrate the extensibility of our protocol, we simulate the crosstalk matrix learning procedure for larger arrays of transmon qubits. We observe an empirically linear scaling with system size, while maintaining a median qubit frequency error below 300 kHz.
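
    As context for the crosstalk-calibration problem, the sketch below shows a generic linear least-squares baseline: recover a crosstalk matrix from applied flux-line settings and the resulting (noisy) flux seen by each qubit, then invert it for compensation. The linear model and synthetic data are assumptions for illustration and do not reproduce the paper's learning-based protocol.

```python
# Generic least-squares baseline for flux-crosstalk characterisation
# (illustrative only; synthetic data and a linear crosstalk model are
# assumed, not the paper's learning-based calibration protocol).
import numpy as np

rng = np.random.default_rng(1)
n_qubits = 16

# Ground-truth crosstalk matrix: strong diagonal response plus small
# off-diagonal couplings from other flux lines.
true_M = np.eye(n_qubits) + 0.02 * rng.standard_normal((n_qubits, n_qubits))

# Apply random flux-line settings and "measure" the flux seen by each qubit
# (noise stands in for frequency-estimation error).
n_settings = 200
V = rng.uniform(-1, 1, size=(n_settings, n_qubits))          # applied settings
Phi = V @ true_M.T + 1e-3 * rng.standard_normal((n_settings, n_qubits))

# Least-squares estimate of the crosstalk matrix: Phi ~= V @ M.T
M_est = np.linalg.lstsq(V, Phi, rcond=None)[0].T

# Compensation: pre-distort settings with the inverse estimated matrix so
# each qubit sees (approximately) only its intended flux.
compensation = np.linalg.inv(M_est)
residual = true_M @ compensation - np.eye(n_qubits)
print("max matrix-element error:", np.max(np.abs(M_est - true_M)))
print("max residual crosstalk after compensation:", np.max(np.abs(residual)))
```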