
    UK Housing Market: Time Series Processes with Independent and Identically Distributed Residuals

    The paper examines whether a univariate data generating process can be identified that explains the data by having residuals which are independent and identically distributed, as verified by the BDS test. The stationary first-differenced natural log of the quarterly house price index is regressed, initially with a constant variance and then with a conditional variance. The only regression function that produces independent and identically distributed standardised residuals is a mean process based on a pure random walk with Exponential GARCH in mean for the conditional variance. There is an indication of an asymmetric volatility feedback effect, but higher-frequency data are required to confirm this. There could be scope for forecasting the index, but this is tempered by the reduction in the power of the BDS test when the conditional variance process is non-linear.
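
    A minimal sketch of the kind of specification check described above, assuming the Python `arch` and `statsmodels` packages; the index series below is synthetic, and the volatility-in-mean term of the paper's EGARCH-in-mean model is omitted, so this illustrates the general workflow rather than reproducing the paper's estimation.

```python
# Sketch: EGARCH fit on first-differenced log house prices, then a BDS test
# on the standardised residuals (illustrative only; synthetic data).
import numpy as np
import pandas as pd
from arch import arch_model
from statsmodels.tsa.stattools import bds

# Synthetic stand-in for a quarterly house price index (replace with real data).
rng = np.random.default_rng(0)
hpi = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.01, 0.02, 160))))

# Stationary first-differenced natural log of the index, scaled to per cent.
dlog = 100 * np.log(hpi).diff().dropna()

# Pure random-walk mean (zero-mean differenced series) with an EGARCH(1,1)
# conditional variance; the paper's volatility-in-mean term is not included.
model = arch_model(dlog, mean="Zero", vol="EGARCH", p=1, o=1, q=1)
res = model.fit(disp="off")

# BDS test on the standardised residuals: failing to reject the i.i.d. null
# supports the chosen specification (note the paper's caveat on test power).
stat, pvalue = bds(res.std_resid.dropna(), max_dim=5)
print(pvalue)
```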

    Progress in operational modeling in support of oil spill response

    Following the 2010 Deepwater Horizon accident, a massive blow-out in the Gulf of Mexico, scientists from government, industry, and academia collaborated to advance oil spill modeling and share best practices in model algorithms, parameterizations, and application protocols. This synergy was greatly enhanced by research funded under the Gulf of Mexico Research Initiative (GoMRI), a 10-year enterprise that allowed unprecedented collection of observations and data products, novel experiments, and international collaborations that focused on the Gulf of Mexico but resulted in the generation of scientific findings and tools of broader value. Operational oil spill modeling greatly benefited from research during the GoMRI decade. This paper provides a comprehensive synthesis of the related scientific advances, remaining challenges, and future outlook. Two main modeling components are discussed, ocean circulation models and oil spill models, providing details on all attributes that contribute to the success and limitations of integrated oil spill forecasts. These forecasts are discussed in tandem with uncertainty factors and methods to mitigate them. The paper focuses on operational aspects of oil spill modeling and forecasting, including examples of international operational center practices, observational needs, communication protocols, and promising new methodologies.

    Strong Gravitational Lensing in a Charged Squashed Kaluza-Klein Black Hole

    In this paper we investigate strong gravitational lensing by a charged squashed Kaluza-Klein black hole. We suppose that the supermassive black hole at the galactic center can be modeled as a charged squashed Kaluza-Klein black hole, apply strong gravitational lensing theory, and estimate numerical values for its lensing parameters and observables. We explore the effects of the scale of the extra dimension ρ0 and the charge of the black hole ρq on these parameters and observables.

    Characterizing sampling biases in the trace gas climatologies of the SPARC Data Initiative

    Monthly zonal mean climatologies of atmospheric measurements from satellite instruments can have biases due to the non-uniform sampling of the atmosphere by the instruments. We characterize potential sampling biases in stratospheric trace gas climatologies of the Stratospheric Processes and their Role in Climate (SPARC) Data Initiative using chemical fields from a chemistry climate model simulation and sampling patterns from 16 satellite-borne instruments. The exercise is performed for the long-lived stratospheric trace gases O3 and H2O. Monthly sampling biases for O3 exceed 10% for many instruments in the high-latitude stratosphere and in the upper troposphere/lower stratosphere, while annual mean sampling biases reach values of up to 20% in the same regions for some instruments. Sampling biases for H2O are generally smaller than for O3, although still notable in the upper troposphere/lower stratosphere and Southern Hemisphere high latitudes. The most important mechanism leading to monthly sampling bias is the non-uniform temporal sampling of many instruments, i.e., the fact that for many instruments, monthly means are produced from measurements which span less than the full month in question. Similarly, annual mean sampling biases are well explained by non-uniformity in the month-to-month sampling by different instruments. Non-uniform sampling in latitude and longitude is also shown to lead to non-negligible sampling biases, which are most relevant for climatologies that are otherwise free of sampling biases due to non-uniform temporal sampling.
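
    A toy numpy sketch of the sampling-bias calculation described above: a synthetic "model truth" trace-gas field is averaged over a full month and over a hypothetical instrument's non-uniform sampling pattern, and the relative difference between the two means is the sampling bias. All names, grids, and numbers are illustrative, not the SPARC Data Initiative processing.

```python
# Toy illustration: monthly-mean sampling bias from non-uniform instrument
# sampling of a model trace-gas field (all data here are synthetic).
import numpy as np

# Synthetic "model truth": daily zonal-mean O3 values for one month on a
# latitude grid, with a linear drift through the month.
n_days, n_lats = 30, 36
lats = np.linspace(-87.5, 87.5, n_lats)
drift = np.linspace(0.0, 0.3, n_days)[:, None]           # drift over the month
o3_model = 3.0 + 0.5 * np.cos(np.deg2rad(lats)) + drift  # shape (day, lat)

# Hypothetical instrument: only samples the first ten days of the month
# (non-uniform temporal sampling, the dominant bias mechanism in the paper).
sampled = np.zeros((n_days, n_lats), dtype=bool)
sampled[:10, :] = True

true_mean = o3_model.mean(axis=0)                          # full-month mean
instrument_mean = np.nanmean(np.where(sampled, o3_model, np.nan), axis=0)

sampling_bias_pct = 100 * (instrument_mean - true_mean) / true_mean
print(sampling_bias_pct.round(2))
```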

    Optimizing quantum gates towards the scale of logical qubits

    A foundational assumption of quantum error correction theory is that quantum gates can be scaled to large processors without exceeding the error threshold for fault tolerance. Two major challenges that could become fundamental roadblocks are manufacturing high performance quantum hardware and engineering a control system that can reach its performance limits. The control challenge of scaling quantum gates from small to large processors without degrading performance often maps to non-convex, high-constraint, and time-dependent control optimization over an exponentially expanding configuration space. Here we report on a control optimization strategy that can scalably overcome the complexity of such problems. We demonstrate it by choreographing the frequency trajectories of 68 frequency-tunable superconducting qubits to execute single- and two-qubit gates while mitigating computational errors. When combined with a comprehensive model of physical errors across our processor, the strategy suppresses physical error rates by ∌3.7× compared with the case of no optimization. Furthermore, it is projected to achieve a similar performance advantage on a distance-23 surface code logical qubit with 1057 physical qubits. Our control optimization strategy solves a generic scaling challenge in a way that can be adapted to other quantum algorithms, operations, and computing architectures.
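
    The control problem described above can be illustrated with a deliberately simplified toy: choosing idle frequencies for a short chain of tunable qubits so that each qubit stays near a preferred operating point while neighbouring frequencies avoid collisions. The sketch below uses scipy for the optimization; every number and name is hypothetical, and this is not the authors' optimizer.

```python
# Toy sketch of frequency-placement optimization for tunable qubits: keep each
# qubit near a preferred ("sweet spot") frequency while penalising frequency
# collisions between neighbours.  Purely illustrative; not the paper's method.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n_qubits = 8
sweet_spots = rng.uniform(5.8, 6.6, n_qubits)    # GHz, hypothetical targets
min_detuning = 0.10                              # GHz, desired neighbour gap

def cost(freqs):
    # Penalty for straying from sweet spots (detuning degrades coherence).
    detune_cost = np.sum((freqs - sweet_spots) ** 2)
    # Soft penalty when neighbouring qubits come closer than min_detuning.
    gaps = np.abs(np.diff(freqs))
    collision_cost = np.sum(np.maximum(0.0, min_detuning - gaps) ** 2)
    return detune_cost + 100.0 * collision_cost

result = minimize(cost, x0=sweet_spots, method="Nelder-Mead",
                  options={"maxiter": 20000, "xatol": 1e-6, "fatol": 1e-9})
print(result.x.round(3))            # optimized idle frequencies (GHz)
print(np.abs(np.diff(result.x)))    # neighbour detunings after optimization
```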

    Challenges in physician supply planning: the case of Belgium

    Introduction: Planning human resources for health (HRH) is a complex process for policy-makers and, as a result, many countries worldwide swing from surplus to shortage. In-depth case studies can help in appraising the challenges encountered and the solutions implemented. This paper has two objectives: to identify the key challenges in HRH planning in Belgium and to formulate recommendations for effective HRH planning, on the basis of the Belgian case study and lessons drawn from an international benchmarking exercise.
    Case description: In Belgium, a numerus clausus set up in 1997 and effective in 2004 aims to limit the total number of physicians working in the curative sector. The assumption of a positive relationship between physician density and health care utilization was a major argument in favor of medical supply restrictions. This new regulation did not resolve recurrent challenges such as specialty imbalances, with unmet needs particularly among general practitioners, and geographical maldistribution. New difficulties also emerged; in particular, limiting national training of HRH turned out to be ineffective within the open European workforce market. The lack of integration of policies affecting HRH was noteworthy. The paper describes the strategies developed to address these challenges in Belgium and in neighboring countries.
    Discussion and evaluation: Planning the medical workforce involves determining the numbers, mix, and distribution of health providers that will be required at some identified future point in time. To succeed in their task, health policy planners have to take a broader perspective on the healthcare system. Focusing on numbers is too restrictive, and adopting innovative policies learned from benchmarking without integration and coordination is unfruitful. Evolving towards strategic planning is essential to control the effects of the complex factors impacting human resources. This evolution requires effective monitoring of all key factors affecting supply and demand, a dynamic approach, and a system-level perspective that considers all healthcare professionals and integrates manpower planning with workforce development.
    Conclusion: To engage in evidence-based action, policy-makers need a global manpower picture, from their own country and abroad, as well as reliable and comparable manpower databases allowing proper analysis and planning of the workforce.

    Insights into the pathogenesis of vein graft disease: lessons from intravascular ultrasound

    The success of coronary artery bypass grafting (CABG) is limited by poor long-term graft patency. Saphenous vein is used in the vast majority of CABG operations, yet 15% of vein grafts are occluded at one year and as many as 50% at 10 years due to progressive graft atherosclerosis. Intravascular ultrasound (IVUS) has greatly increased our understanding of this process. IVUS studies have shown that early wall thickening and adaptive remodelling of vein grafts occur within the first few weeks post-implantation, with these changes stabilising in angiographically normal vein grafts after six months. Early changes predispose to later atherosclerosis, with occlusive plaque detectable in vein grafts within the first year. Both expansive and constrictive remodelling are present in diseased vein grafts, with the latter contributing significantly to occlusive disease. These findings correlate closely with experimental and clinicopathological studies and help define the windows for prevention, intervention, or plaque stabilisation strategies. IVUS is also the natural tool for evaluating the effectiveness of pharmacological and other treatments that may prevent or slow the progression of vein graft disease in clinical trials.

    Could increased axial wall stress be responsible for the development of atheroma in the proximal segment of myocardial bridges?

    Background: A recent model describing the mechanical interaction between a stenosis and the vessel wall has shown that axial wall stress can considerably increase in the region immediately proximal to the stenosis during the (forward) flow phases, so that abnormal biological processes and wall damage are likely to be induced in that region. Our objective was to examine what this model predicts when applied to myocardial bridges.
    Method: The model was adapted to the hemodynamic particularities of myocardial bridges and used to estimate, by means of a numerical example, the cyclic increase in axial wall stress in the vessel segment proximal to the bridge. The consistency of the results with reported observations on the presence of atheroma in the proximal, tunneled, and distal vessel segments of bridged coronary arteries was also examined.
    Results: 1) Axial wall stress can markedly increase in the entrance region of the bridge during the cardiac cycle. 2) This is consistent with reported observations showing that this region is particularly prone to atherosclerosis.
    Conclusion: The proposed mechanical explanation of atherosclerosis in bridged coronary arteries indicates that angioplasty and other similar interventions will not stop the development of atherosclerosis at the bridge entrance and in the proximal epicardial segment unless the decrease of the lumen of the tunneled segment during systole is considerably reduced.
