
    Multifidelity conditional value-at-risk estimation by dimensionally decomposed generalized polynomial chaos-Kriging

    We propose novel methods for Conditional Value-at-Risk (CVaR) estimation for nonlinear systems under high-dimensional dependent random inputs. We develop a novel DD-GPCE-Kriging surrogate that merges dimensionally decomposed generalized polynomial chaos expansion and Kriging to accurately approximate nonlinear and nonsmooth random outputs. We use DD-GPCE-Kriging (1) for Monte Carlo simulation (MCS) and (2) within multifidelity importance sampling (MFIS). The MCS-based method samples from DD-GPCE-Kriging, which is efficient and accurate for high-dimensional dependent random inputs, yet introduces bias. Thus, we propose an MFIS-based method in which DD-GPCE-Kriging determines the biasing density, from which we draw a few high-fidelity samples to provide an unbiased CVaR estimate. To accelerate the biasing density construction, we compute DD-GPCE-Kriging using a cheap-to-evaluate low-fidelity model. Numerical results for mathematical functions show that the MFIS-based method is more accurate than the MCS-based method when the output is nonsmooth. The scalability of the proposed methods and their applicability to complex engineering problems are demonstrated on a two-dimensional composite laminate with 28 (partly dependent) random inputs and a three-dimensional composite T-joint with 20 (partly dependent) random inputs. In the former, the proposed MFIS-based method achieves a 104x speedup compared to standard MCS using the high-fidelity model, while accurately estimating CVaR with 1.15% error. Comment: 34 pages, 8 figures, research paper
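
    As a rough, hedged illustration of the sampling-based CVaR estimation described above (not the paper's DD-GPCE-Kriging construction or its MFIS estimator), the Python sketch below uses a placeholder surrogate and a placeholder high-fidelity model: many cheap surrogate samples give a (biased) CVaR estimate, and a small high-fidelity budget is then spent only on inputs the surrogate flags as tail events. All model definitions, sample sizes, and the risk level are illustrative assumptions.

```python
# Minimal sketch of sampling-based CVaR estimation with a cheap surrogate.
# The models below are placeholders, not the paper's DD-GPCE-Kriging surrogate.
import numpy as np

rng = np.random.default_rng(0)

def cvar(y, alpha=0.95):
    """CVaR_alpha: mean of outputs at or above the alpha-quantile (VaR_alpha)."""
    var = np.quantile(y, alpha)
    return y[y >= var].mean()

# Placeholder models (assumptions):
surrogate = lambda x: np.sum(x**2, axis=1)                              # cheap approximation
high_fidelity = lambda x: np.sum(x**2, axis=1) + 0.1 * np.abs(x[:, 0])  # expensive "truth"

# (1) MCS on the surrogate: many cheap samples, but biased by surrogate error.
x = rng.standard_normal((200_000, 10))
print("surrogate-only CVaR:", cvar(surrogate(x)))

# (2) Surrogate-guided refinement: spend a small high-fidelity budget only on
# surrogate-flagged tail inputs (the spirit of MFIS, not its unbiased estimator).
tail_idx = np.argsort(surrogate(x))[-1000:]
print("high-fidelity tail mean:", high_fidelity(x[tail_idx]).mean())
```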

    The Combined Effects of Co-Culture and Substrate Mechanics on 3D Tumor Spheroid Formation within Microgels Prepared via Flow-Focusing Microfluidic Fabrication

    Tumor spheroids are considered a valuable three-dimensional (3D) tissue model for studying various aspects of tumor physiology, both for biomedical applications such as tissue engineering and drug screening and for basic scientific endeavors, since several cell types can efficiently form spheroids by themselves in both suspension and adherent cell cultures. However, it is more desirable to utilize a 3D scaffold with tunable properties to create more physiologically relevant tumor spheroids and to optimize their formation. In this study, bioactive spherical microgels supporting 3D cell culture are fabricated with a flow-focusing microfluidic device. Uniform-sized aqueous droplets of cell-laden gel precursor solution generated by the microfluidic device are photocrosslinked to fabricate cell-laden microgels, whose mechanical properties are controlled by the concentration of the gel-forming polymer. Using MCF-7 breast adenocarcinoma cells, the effect of the mechanical properties of the microgels on cell proliferation and eventual spheroid formation is explored. Furthermore, the tumor cells are co-cultured within the microgels with macrophages or fibroblasts, which are known to play a prominent role in tumor physiology, to explore their contribution to spheroid formation. Taken together, the results from this study provide a design strategy for creating tumor spheroids utilizing mechanically tunable microgels as a 3D cell culture platform.

    Randomly Monitored Quantum Codes

    Quantum measurement has conventionally been regarded as the final step in quantum information processing: it is essential for reading out the processed information, but it collapses the quantum state into a classical state. However, recent studies have shown that quantum measurement itself can induce novel quantum phenomena. One seminal example is a monitored random circuit, which can generate long-range entanglement faster than a random unitary circuit. Inspired by these results, in this paper we address the following question: when quantum information is encoded in a quantum error-correcting code, how many physical qubits should be randomly measured to destroy the encoded information? We investigate this question for various quantum error-correcting codes and derive the necessary and sufficient conditions for destroying the information through measurements. In particular, we demonstrate that for a large class of quantum error-correcting codes, it is impossible to destroy the encoded information through random single-qubit Pauli measurements when a tiny portion of the physical qubits is still unmeasured. Our results not only reveal the extraordinary robustness of quantum codes under measurement decoherence, but also suggest potential applications in quantum information processing tasks. Comment: 30 pages

    Bi-fidelity conditional-value-at-risk estimation by dimensionally decomposed generalized polynomial chaos expansion

    Digital twin models allow us to continuously assess the possible risk of damage and failure of a complex system. Yet high-fidelity digital twin models can be computationally expensive, making quick-turnaround assessment challenging. Towards this goal, this article proposes a novel bi-fidelity method for estimating the conditional value-at-risk (CVaR) for nonlinear systems subject to dependent and high-dimensional inputs. For models that can be evaluated quickly, a method that integrates the dimensionally decomposed generalized polynomial chaos expansion (DD-GPCE) approximation with standard sampling-based CVaR estimation is proposed. For expensive-to-evaluate models, a new bi-fidelity method is proposed that couples the DD-GPCE with a Fourier-polynomial expansion of the mapping between the stochastic low-fidelity and high-fidelity output data to ensure computational efficiency. The method employs a measure-consistent orthonormal polynomial in the random variable of the low-fidelity output to approximate the high-fidelity output. Numerical results for a structural mechanics truss with 36-dimensional, dependent random inputs indicate that the DD-GPCE method provides very accurate CVaR estimates at much lower computational effort than standard GPCE approximations. A second example considers the realistic problem of estimating the risk of damage to a fiber-reinforced composite laminate. The high-fidelity model is a finite element simulation that is prohibitively expensive for risk analysis, such as CVaR computation. Here, the novel bi-fidelity method can accurately estimate CVaR, as it includes low-fidelity models in the estimation procedure and uses only a few high-fidelity model evaluations to significantly increase accuracy. Comment: Added acknowledgment
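
    To make the bi-fidelity idea concrete, here is a minimal sketch, assuming placeholder low- and high-fidelity models and a simple least-squares polynomial in place of the measure-consistent Fourier-polynomial expansion used in the paper: a few paired evaluations fit a map from low-fidelity output to high-fidelity output, and many cheap low-fidelity samples are then pushed through that map to estimate CVaR.

```python
# Hedged sketch of a bi-fidelity CVaR estimate: learn y_hi ~ p(y_lo) from a few
# paired runs, then reuse many cheap low-fidelity evaluations. Models, sample
# counts, and the cubic fit are illustrative assumptions, not the paper's method.
import numpy as np

rng = np.random.default_rng(1)

def cvar(y, alpha=0.95):
    """Mean of outputs at or above the alpha-quantile (VaR_alpha)."""
    var = np.quantile(y, alpha)
    return y[y >= var].mean()

# Placeholder low- and high-fidelity models over 36 input dimensions (assumption).
f_lo = lambda x: np.sum(x**2, axis=1)
f_hi = lambda x: np.sum(x**2, axis=1) + 0.05 * np.sum(x**3, axis=1)

# Step 1: a handful of paired evaluations to fit the low-to-high-fidelity map.
x_train = rng.standard_normal((50, 36))
coeffs = np.polyfit(f_lo(x_train), f_hi(x_train), deg=3)

# Step 2: many cheap low-fidelity samples, corrected through the fitted map.
x_mc = rng.standard_normal((200_000, 36))
y_hat = np.polyval(coeffs, f_lo(x_mc))
print("bi-fidelity CVaR estimate:", cvar(y_hat))
```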

    Spear and Shield: Adversarial Attacks and Defense Methods for Model-Based Link Prediction on Continuous-Time Dynamic Graphs

    Real-world graphs are dynamic, constantly evolving with new interactions, such as financial transactions in financial networks. Temporal Graph Neural Networks (TGNNs) have been developed to effectively capture the evolving patterns in dynamic graphs. While these models have demonstrated their superiority and are widely adopted in various important fields, their vulnerabilities to adversarial attacks remain largely unexplored. In this paper, we propose T-SPEAR, a simple and effective adversarial attack method for link prediction on continuous-time dynamic graphs, focusing on investigating the vulnerabilities of TGNNs. Specifically, before the training procedure of a victim model, which is a TGNN for link prediction, we inject edge perturbations into the data that are unnoticeable in terms of the four constraints we propose, yet effective enough to cause the victim model to malfunction. Moreover, we propose a robust training approach, T-SHIELD, to mitigate the impact of adversarial attacks. By filtering edges and enforcing temporal smoothness on node embeddings, we enhance the robustness of the victim model. Our experimental study shows that T-SPEAR significantly degrades the victim model's performance on link prediction tasks; moreover, our attacks are transferable to other TGNNs that differ from the victim model assumed by the attacker. We also demonstrate that T-SHIELD effectively filters out adversarial edges and exhibits robustness against adversarial attacks, surpassing the link prediction performance of the naive TGNN by up to 11.2% under T-SPEAR.
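
    The temporal-smoothness component of such a defense can be pictured with a short, hedged sketch: penalize large jumps between a node's consecutive embeddings and add that penalty to the model's own training loss. The tensor shapes, the weight lam, and the stand-in task loss below are assumptions for illustration, not T-SHIELD's exact formulation.

```python
# Hedged sketch of a temporal-smoothness regularizer on node embeddings.
import torch

def temporal_smoothness_loss(h_prev: torch.Tensor, h_curr: torch.Tensor) -> torch.Tensor:
    """Mean squared change of node embeddings between consecutive time steps.

    h_prev, h_curr: [num_nodes, dim] embeddings of the same nodes at t-1 and t.
    """
    return ((h_curr - h_prev) ** 2).sum(dim=1).mean()

# Illustrative usage inside a training step (shapes and weight are assumptions).
lam = 0.1                                   # regularization weight
h_prev = torch.randn(1000, 64)              # embeddings from the previous batch
h_curr = torch.randn(1000, 64, requires_grad=True)  # current embeddings
task_loss = torch.tensor(0.0)               # stand-in for the TGNN's link-prediction loss
loss = task_loss + lam * temporal_smoothness_loss(h_prev, h_curr)
loss.backward()                             # gradients flow through h_curr only
```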

    A Framework for Measuring the Performance and Power Consumption of Storage Components under Typical Workload

    Although the costs of storage components are reported accurately by the vendors, it is not clear whether the performance (IOps, MiBps) and power consumption (W) specifications they provide are accurate under ‘typical’ workloads. Accurately measuring this information is a vital step in providing input for optimal storage systems design. This paper measures storage disk performance and power consumption using ‘typical’ workloads. The workloads are generated using an open source version of the (industry standard) SPC-1 benchmark. This benchmark creates a realistic synthetic workload that aggregates multiple users utilizing data storage simultaneously. A flexible current sensor board has also been developed to measure various storage devices simultaneously. This work represents a significant contribution to data storage benchmarking resources (both performance and power consumption), as we have embedded the open source SPC-1 benchmark spc1 within the open source workload generator fio, in addition to developing our flexible current sensor. The integration provides an easily available benchmark for researchers developing new storage technologies. This benchmark should give a reasonable estimate of performance under the official SPC-1 benchmark for systems that do not yet fulfill all the requirements for an official SPC-1 run. With accurate information, our framework shows promise in alleviating much of the complexity of future storage systems design.
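
    As a hedged illustration of the quantities such a framework reports, the short sketch below derives throughput (IOps, MiBps) and average power (W) from a benchmark summary and a synchronized current-sensor trace; all values and variable names are invented for the example and do not reflect the framework's actual output format.

```python
# Illustrative post-processing of one benchmark run: throughput and power metrics.
# All numbers below are assumed, not measurements from the described framework.
import numpy as np

# Benchmark summary for one run (assumed values).
completed_ios = 1_250_000          # total I/O operations during the run
bytes_moved = 9.8 * 2**30          # total bytes transferred
runtime_s = 600.0                  # run duration in seconds

iops = completed_ios / runtime_s
mibps = bytes_moved / 2**20 / runtime_s

# Current-sensor trace: per-sample supply voltage (V) and drawn current (A).
voltage = np.full(60_000, 12.0)                                  # 12 V rail (assumption)
current = np.random.default_rng(2).normal(0.45, 0.05, 60_000)    # synthetic samples
avg_power_w = float(np.mean(voltage * current))

print(f"{iops:.0f} IOps, {mibps:.1f} MiBps, {avg_power_w:.2f} W")
print(f"energy per I/O: {avg_power_w * runtime_s / completed_ios * 1000:.3f} mJ")
```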