A Coexceedance Approach on Financial Contagion
The paper sheds light on financial contagion within the Euro Area and Asia, and contagion from the Euro Area to Asia, during two recent crises: the global financial crisis and the European sovereign debt crisis. Applying a multinomial logit regression model, the paper investigates how macro-finance variables affect the coincidence of extreme negative returns (coexceedances). In addition, I apply both the original constant threshold, i.e., the 5th percentile of the unconditional distribution of daily stock returns, and Value-at-Risk to identify extreme negative returns; the two approaches yield a similar pattern. The empirical findings reveal that, in the Euro Area and Asia, the probability of coexceedances is strongly explained by idiosyncratic risks (changes in exchange rates and regional stock market volatility) and by global shocks (changes in U.S. long-term interest rates and the TED spread). The global volatility index is significant in explaining the likelihood of coexceedances only in the Euro Area, not in Asia. These analyses lead to the conclusion that contagion in Asia is more important than in the Euro Area. Another important finding indicates the existence of contagion from the Euro Area to Asia: the probability of coexceedances in Asia is predictable and depends on the number of joint occurrences of extreme return shocks in the Euro Area.
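As a rough illustration of the estimation approach, a multinomial logit over coexceedance counts can be sketched as follows. All covariate names, coefficients, and data here are hypothetical stand-ins, not the paper's dataset or estimates; the outcome is simulated so the example is self-contained.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Hypothetical daily macro-finance covariates (illustrative only):
# FX change, regional volatility, change in US long rate, TED spread.
X = rng.normal(size=(n, 4))

# Outcome: number of markets with an extreme negative return on the same
# day, bucketed as 0, 1, or 2+ coexceedances, simulated from assumed betas.
beta = np.array([[0.0, 0.0, 0.0, 0.0],   # baseline: no coexceedance
                 [0.8, 0.5, 0.3, 0.4],   # exactly one coexceedance
                 [1.2, 1.0, 0.6, 0.9]])  # two or more coexceedances
scores = X @ beta.T
probs = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=p) for p in probs])

# Multinomial logit: P(y = j | x) via a softmax over linear scores.
model = LogisticRegression(max_iter=1000).fit(X, y)
p_hat = model.predict_proba(X)  # one probability per coexceedance bucket
```

The fitted probabilities `p_hat` play the role of the predicted likelihood of joint extreme-return days given the covariates.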
Optimal control of large quantum systems: assessing memory and runtime performance of GRAPE
Gradient Ascent Pulse Engineering (GRAPE) is a popular technique in quantum
optimal control, and can be combined with automatic differentiation (AD) to
facilitate on-the-fly evaluation of cost-function gradients. We illustrate that
the convenience of AD comes at a significant memory cost due to the cumulative
storage of a large number of states and propagators. For quantum systems of
increasing Hilbert space size, this imposes a significant bottleneck. We
revisit the strategy of hard-coding gradients in a scheme that fully avoids
propagator storage and significantly reduces memory requirements. Separately,
we present improvements to numerical state propagation to enhance runtime
performance. We benchmark runtime and memory usage and compare this approach to
AD-based implementations, with a focus on pushing towards larger Hilbert space
sizes. The results confirm that the AD-free approach facilitates the
application of optimal control for large quantum systems which would otherwise
be difficult to tackle.Comment: 14 pages, 6 figures, 1 tabl
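The memory trade-off the abstract describes can be illustrated with a simplified sketch of our own (a first-order GRAPE gradient, not the paper's implementation): instead of caching every propagator and intermediate state for the backward pass, as AD frameworks do, the backward pass re-derives each forward state by applying the adjoint of the unitary step, so only two state vectors are ever held in memory.

```python
import numpy as np
from scipy.linalg import expm

def fidelity(H0, Hc, u, dt, psi0, psi_tgt):
    """Overlap fidelity |<psi_tgt|U(u)|psi0>|^2 for piecewise-constant controls."""
    psi = psi0
    for uk in u:
        psi = expm(-1j * dt * (H0 + uk * Hc)) @ psi
    return abs(np.vdot(psi_tgt, psi)) ** 2

def grape_gradient(H0, Hc, u, dt, psi0, psi_tgt):
    """First-order GRAPE gradient that stores only two state vectors.

    The backward pass unwinds the forward state with the adjoint
    propagator (U^{-1} = U^dagger for unitary U), so memory stays
    O(dim) instead of O(N * dim^2) for N time slices."""
    N = len(u)
    psi = psi0
    for k in range(N):                      # forward pass: keep final state only
        psi = expm(-1j * dt * (H0 + u[k] * Hc)) @ psi
    c = np.vdot(psi_tgt, psi)               # overlap <psi_tgt|psi_N>
    lam = psi_tgt.astype(complex)           # costate, built up backwards
    grad = np.empty(N)
    for k in range(N - 1, -1, -1):          # backward pass
        # dU_k/du_k ~= -i dt Hc U_k (first order in dt)
        dcdu = np.vdot(lam, (-1j * dt) * (Hc @ psi))
        grad[k] = 2 * np.real(np.conj(c) * dcdu)
        U = expm(-1j * dt * (H0 + u[k] * Hc))
        psi = U.conj().T @ psi              # unwind forward state to step k-1
        lam = U.conj().T @ lam              # extend costate one step back
    return grad
```

For small `dt` the result agrees with a finite-difference derivative of the fidelity; a production scheme would use an exact gradient of the matrix exponential rather than the first-order approximation above.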
Impact of sea level rise on currents and waves in the Van Uc coastal area
This paper presents an analysis and comparison of some characteristics of currents and waves in the Van Uc estuary area under sea level rise due to climate change, based on the Delft3D model. Scenario groups are established: the current conditions, and scenarios simulating sea level rise of 0.5 m and 1.0 m. The calculations and simulations show that velocity values change locally when sea level rises: they increase in the northern and southern areas (0.2–5 cm/s) and decrease in the navigation channel (0.6–30 cm/s). Sea level rise increases wave height in the coastal area (13.5–43.8% in the dry season and 20–40% in the rainy season), with smaller changes in the outer area.
Environmental Impacts and Composition-Biotoxic Activity of Natural Hydrocarbon Raw Materials and Processed Products
The paper aims to determine the environmental impacts and the composition-biotoxic activity relationship of natural hydrocarbon raw materials and processed products. Using a descriptive method as the primary model, together with synthesis methods, process analysis, and discussion of difficulties, the study shows that assessing the environmental impact of hydrocarbons on the biosphere requires accurate knowledge of the physical properties and chemical composition of oil and gas. Numerous tragic accidents, such as those associated with leaks of hydrogen sulfide gas, illustrate this need. Hydrogen sulfide, owing to its higher density relative to air, settles in low-lying areas of the terrain and, in calm weather, accumulates to lethal concentrations. This leads to the death of animals and of people; the latter could be avoided by moving to elevated windward areas, i.e., by knowing the physical properties of this toxic gas.
Two-view Graph Neural Networks for Knowledge Graph Completion
We present an effective GNN-based knowledge graph embedding model, named WGE,
to capture entity- and relation-focused graph structures. In particular, given
the knowledge graph, WGE builds a single undirected entity-focused graph that
views entities as nodes. In addition, WGE also constructs another single
undirected graph from relation-focused constraints, which views entities and
relations as nodes. WGE then employs a GNN-based architecture to better learn
vector representations of entities and relations from these two single entity-
and relation-focused graphs. WGE feeds the learned entity and relation
representations into a weighted score function to return the triple scores for
knowledge graph completion. Experimental results show that WGE outperforms
competitive baselines, obtaining state-of-the-art performance on seven
benchmark datasets for knowledge graph completion. Comment: 13 pages; 3 tables; 3 figures
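The two views described above can be made concrete with a small sketch. This is our own illustration of the graph construction the abstract describes, not WGE's code; the function and variable names are hypothetical.

```python
import numpy as np

def build_two_views(triples, n_ent, n_rel):
    """Build the two undirected graphs from a list of (head, rel, tail)
    triples: an entity-focused view and a relation-focused view."""
    # View 1: entity-focused graph -- entities as nodes, with an edge
    # joining the head and tail entity of every triple.
    A_ent = np.zeros((n_ent, n_ent), dtype=int)
    # View 2: relation-focused graph -- entities AND relations as nodes;
    # each triple connects its head and tail entities to its relation node.
    n = n_ent + n_rel
    A_rel = np.zeros((n, n), dtype=int)
    for h, r, t in triples:
        A_ent[h, t] = A_ent[t, h] = 1
        rnode = n_ent + r                 # relation nodes follow entity nodes
        A_rel[h, rnode] = A_rel[rnode, h] = 1
        A_rel[t, rnode] = A_rel[rnode, t] = 1
    return A_ent, A_rel
```

A GNN encoder would then run message passing over each adjacency matrix separately, and the resulting entity and relation vectors would feed a weighted triple-scoring function.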
Semi-Supervised Semantic Segmentation using Redesigned Self-Training for White Blood Cells
Artificial Intelligence (AI) in healthcare, especially in white blood cell
cancer diagnosis, is hindered by two primary challenges: the lack of
large-scale labeled datasets for white blood cell (WBC) segmentation and
outdated segmentation methods. These challenges inhibit the development of more
accurate and modern techniques to diagnose cancer relating to white blood
cells. To address the first challenge, a semi-supervised learning framework
should be devised to efficiently capitalize on the scarce labeled data
available. In this work, we address this issue by proposing a novel
self-training pipeline with the incorporation of FixMatch. Self-training is a
technique that utilizes the model trained on labeled data to generate
pseudo-labels for the unlabeled data and then re-train on both of them.
FixMatch is a consistency-regularization algorithm to enforce the model's
robustness against variations in the input image. We discover that by
incorporating FixMatch in the self-training pipeline, the performance improves
in the majority of cases. Our best results were achieved with the
self-training scheme with consistency on the DeepLab-V3 architecture with a
ResNet-50 backbone, reaching 90.69%, 87.37%, and 76.49% on the Zheng 1,
Zheng 2, and LISC datasets, respectively.
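The self-training loop described above (train on labeled data, pseudo-label the unlabeled pool, retrain on both) can be sketched on toy data. This is a minimal stand-in using a linear classifier, not the paper's segmentation pipeline; the confidence threshold mimics the FixMatch idea of keeping only high-confidence pseudo-labels.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy stand-in for the segmentation setting: a small labeled pool and a
# large unlabeled pool (the classifier and data are illustrative only).
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_lab, y_lab = X[:60], y[:60]       # scarce labeled data
X_unl, y_hidden = X[60:], y[60:]    # unlabeled pool (labels held out)

model = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)
for _ in range(3):                  # a few self-training rounds
    probs = model.predict_proba(X_unl)
    conf = probs.max(axis=1)
    keep = conf >= 0.95             # FixMatch-style confidence threshold
    pseudo = probs.argmax(axis=1)
    # Retrain on labeled data plus the confident pseudo-labels.
    X_aug = np.vstack([X_lab, X_unl[keep]])
    y_aug = np.concatenate([y_lab, pseudo[keep]])
    model = LogisticRegression(max_iter=1000).fit(X_aug, y_aug)

acc = model.score(X_unl, y_hidden)  # held-out accuracy after self-training
```

In the full method, the model is a segmentation network and FixMatch additionally enforces consistency between weakly and strongly augmented views of each unlabeled image.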
On-Chip Noise Sensor for Integrated Circuit Susceptibility Investigations
With growing concerns about the electromagnetic compatibility of integrated circuits, the need for accurate prediction tools and models to reduce the risk of non-compliance becomes critical for circuit designers. However, on-chip characterization of noise is still necessary for model validation and design optimization. Although various on-chip measurement solutions have been proposed for characterizing emission issues, no on-chip measurement methods have been proposed to address susceptibility issues. This paper presents an on-chip noise sensor dedicated to the study of circuit susceptibility to electromagnetic interference. The sensor's measurement performance and benefits are demonstrated through a study of the susceptibility of a digital core to conducted interference. Sensor measurements provide a better characterization of the actual coupling of interference within the circuit and a diagnosis of failure origins.