
    Popular edges and dominant matchings


    Popular Matchings in Complete Graphs


    Relation between the phenomenological interactions of the algebraic cluster model and the effective two-nucleon forces

    We determine the phenomenological cluster-cluster interactions of the algebraic model corresponding to the most often used effective two-nucleon forces for the ^{16}O + α system. Comment: LaTeX with RevTeX, 1 figure available on request

    New and simple algorithms for stable flow problems

    Stable flows generalize the well-known concept of stable matchings to markets in which transactions may involve several agents, forwarding flow from one to another. An instance of the problem consists of a capacitated directed network in which vertices express their preferences over their incident edges. A network flow is stable if there is no group of vertices that could all benefit from rerouting the flow along a walk. Fleiner established that a stable flow always exists by reducing the problem to the stable allocation problem. We present an augmenting-path algorithm for computing a stable flow, the first algorithm that achieves polynomial running time for this problem without using stable allocation as a black-box subroutine. We further consider the problem of finding a stable flow such that the flow value on every edge is within a given interval. For this problem, we present an elegant graph transformation and, based on this, we devise a simple and fast algorithm, which can also be used to find a solution to the stable marriage problem with forced and forbidden edges. Finally, we study the stable multicommodity flow model introduced by Kiråly and Pap. The original model is highly involved and allows for commodity-dependent preference lists at the vertices and commodity-specific edge capacities. We present several graph-based reductions that show equivalence to a significantly simpler model. We further show that it is NP-complete to decide whether an integral solution exists.
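    The stable flow model above generalizes stable matchings; as background for the stability notion involved, here is a minimal sketch of the classic Gale-Shapley deferred-acceptance algorithm for the basic stable marriage problem. This is not the paper's augmenting-path stable flow algorithm, and all names and preference data below are illustrative.

```python
def gale_shapley(men_prefs, women_prefs):
    """Return a stable matching as a dict {man: woman}.

    men_prefs / women_prefs: dict mapping each agent to a preference
    list, most preferred first (illustrative input format).
    """
    # rank[w][m] = position of m in w's list (lower is better)
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    free = list(men_prefs)                   # men not yet matched
    next_choice = {m: 0 for m in men_prefs}  # index of next proposal
    engaged = {}                             # woman -> current partner

    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]     # propose to next woman
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m                   # w accepts first proposal
        elif rank[w][m] < rank[w][engaged[w]]:
            free.append(engaged[w])          # w trades up; old partner freed
            engaged[w] = m
        else:
            free.append(m)                   # w rejects m
    return {m: w for w, m in engaged.items()}
```

    The result contains no blocking pair: no man and woman who would both prefer each other over their assigned partners, which is the condition the stable flow model lifts to walks in a capacitated network.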

    VLBI observation of the newly discovered z=5.18 quasar SDSS J0131-0321

    Few high-redshift, radio-loud quasars are known to date. The extremely luminous, radio-bright quasar SDSS J013127.34-032100.1 was recently discovered at a redshift of z = 5.18. We observed the source with high-resolution very long baseline interferometry (VLBI) at 1.7 GHz with the European VLBI Network (EVN) and found a single compact radio component. We estimated a lower limit to the brightness temperature of the detected radio component, T_B ~ 10^{11} K. Additionally, when compared to archival radio data, the source showed significant flux density variation. These two findings are indicative of the blazar nature of the source. Comment: 5 pages, 2 figures. Accepted for publication in MNRAS Letters
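    The brightness temperature limit quoted above can be illustrated with the commonly used approximation for a Gaussian VLBI component in the Rayleigh-Jeans limit; the numerical coefficient is the standard one for flux density in Jy, angular sizes in mas and frequency in GHz. The input values below are purely illustrative, not the paper's measurements.

```python
def brightness_temperature(s_jy, theta1_mas, theta2_mas, nu_ghz, z):
    """Rest-frame brightness temperature (K) of a Gaussian radio
    component: T_B = 1.22e12 * (1+z) * S / (theta1 * theta2 * nu^2),
    with S in Jy, angular sizes in mas, observing frequency in GHz."""
    return 1.22e12 * (1.0 + z) * s_jy / (theta1_mas * theta2_mas * nu_ghz ** 2)

# Illustrative numbers only: a ~30 mJy component, 1 mas across,
# observed at 1.7 GHz at z = 5.18 gives T_B of order 10^10-10^11 K.
tb = brightness_temperature(0.03, 1.0, 1.0, 1.7, 5.18)
```

    Because the fitted component size is an upper limit at a given resolution, the resulting T_B is a lower limit, which is how the abstract's constraint should be read.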

    Discovery of the spectroscopic binary nature of the classical Cepheids FN Aql and V1344 Aql

    We present the analysis of photometric and spectroscopic data of two classical Cepheids, FN Aquilae and V1344 Aquilae. Based on the joint treatment of the new and earlier radial velocity data, both Galactic Cepheids have been found to be members of spectroscopic binary systems. To match the phases of the earlier radial velocity data correctly with the new ones, we also determined the temporal behaviour of the pulsation period of these Cepheids based on all available photometric data. The O-C graph covering about half a century shows slight changes in the pulsation period due to stellar evolution for both Cepheids. Comment: 7 pages, 6 figures, 7 tables, accepted for publication in MNRAS
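    The O-C (observed minus computed) technique mentioned above compares observed moments of maximum light against a linear ephemeris; a drifting O-C trend reveals period changes. A minimal sketch, with an assumed linear ephemeris and illustrative timestamps (not the paper's data):

```python
def o_minus_c(t_obs, t0, period):
    """O-C residuals against the linear ephemeris t0 + E * period.

    t_obs: observed moments of maximum light (e.g. Julian Dates);
    t0, period: reference epoch and assumed constant pulsation period.
    The cycle count E is taken as the nearest integer to (t - t0)/period.
    """
    residuals = []
    for t in t_obs:
        e = round((t - t0) / period)          # nearest whole cycle
        residuals.append(t - (t0 + e * period))
    return residuals

# Illustrative: maxima near a 10-day period; the third maximum
# arrives early, giving a negative O-C residual.
res = o_minus_c([10.1, 20.0, 29.8], t0=0.0, period=10.0)
```

    In practice a constant period gives residuals scattered around zero, while a slow evolutionary period change bends the O-C graph into a parabola-like trend, which is the signature discussed in the abstract.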

    The completeness of electronic medical record data for patients with type 2 diabetes in primary care and its implications for computer modelling of predicted clinical outcomes.

    Background: Computer models predicting outcomes among patients with Type 2 Diabetes (T2D) can be used as disease management program evaluation tools. The clinical data required as inputs for these models can include annually updated measurements such as blood pressure and glycated haemoglobin (HbA1c). These data can be extracted from primary care physician office systems, but there are concerns about their completeness. Objectives/methods: This study addressed the completeness of routinely collected data extracted from 12 primary care practices in Australia. Data on the annual availability of blood pressure, weight, total cholesterol, HDL-cholesterol and HbA1c values for regular patients were extracted in 2013 and analysed for temporal trends over the period 2000 to 2012. An ordinal logistic regression model was used to evaluate associations between patient characteristics and the completeness of their records. Primary care practitioners were surveyed to identify barriers to recording data and strategies to improve its completeness. Results: Over the study period the completeness of data improved substantially, from less than 20% for some parameters up to approximately 80%, except for the recording of weight. T2D patients with ischaemic heart disease were more likely to have their blood pressure recorded (OR 1.6, p=0.02). Practitioners' responses suggest they were not experiencing any major barriers to using their electronic medical record system but did agree with some suggested strategies to improve record completeness. Conclusion: The completeness of routinely collected data suitable for input into computerised predictive models is improving, although other dimensions of data quality need to be addressed.
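    The per-year completeness percentages reported above amount to counting, for each year, the fraction of regular patients with a given parameter recorded. A minimal sketch with a made-up record layout (patient -> year -> set of recorded parameters; all names and values are illustrative, not the study's data):

```python
def completeness_by_year(records, parameter, years):
    """Fraction of patients with `parameter` recorded in each year.

    records: dict mapping patient id -> {year: set of parameter names
    recorded that year} (assumed layout for this sketch).
    """
    n = len(records)
    return {
        year: sum(parameter in patient.get(year, set())
                  for patient in records.values()) / n
        for year in years
    }

# Illustrative data: two patients, each with HbA1c recorded in one
# of two years, giving 50% completeness in both years.
records = {
    'p1': {2010: {'HbA1c', 'BP'}, 2011: set()},
    'p2': {2010: set(), 2011: {'HbA1c'}},
}
shares = completeness_by_year(records, 'HbA1c', [2010, 2011])
```

    Plotting such shares per parameter over 2000-2012 is what exposes the temporal trend the abstract describes, including the lagging completeness of weight recording.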
