457 research outputs found

    Sociodemographic Trends in National Ambulatory Care Visits for Hepatitis C Virus Infection

    Poor and non-white patients are disproportionately infected with the hepatitis C virus (HCV). The objective of this research was to determine sociodemographic patterns of HCV-related ambulatory care visits over time. Data from the National Ambulatory Medical Care Survey (NAMCS) and the National Hospital Ambulatory Medical Care Survey-Outpatient (NHAMCS-OPD) for the years 1997–2005 were analyzed in 3-year intervals. Demographic and other variables were compared for each period, and multivariable logistic regression was performed to examine whether the likelihood of a visit being HCV-related (versus non-HCV) was independently associated with (1) race and/or (2) Medicaid status over time. The total number of HCV-related ambulatory visits more than doubled, from 3,583,585 during 1997–1999 to 8,027,166 during 2003–2005. Over the same period, the proportion of non-whites and Medicaid recipients presenting for HCV-related visits approximately doubled (non-whites: 16% vs. 33%, P = 0.04; Medicaid recipients: 10% vs. 25%, P = 0.07). In 2003–2005, HCV-related visits were more than twice as likely to occur among non-white patients vs. white patients (OR = 2.49; 95% CI: 1.60–3.86) and among patients on Medicaid vs. non-Medicaid (OR = 3.49; 95% CI: 1.79–6.80). Our results show that HCV-associated ambulatory care visits are increasing, with a greater proportion of visits occurring among non-white patients and Medicaid recipients.
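    As a rough illustration of the kind of visit-level model described above, the sketch below fits a logistic regression of an HCV-related-visit indicator on race and Medicaid status and reports odds ratios with 95% confidence intervals. It is not the authors' NAMCS/NHAMCS analysis: the data frame, column names, and coefficients are simulated placeholders, and the survey weights and complex-design adjustments used in the real analysis are omitted.

```python
# Hedged sketch: visit-level logistic regression on simulated data.
# The real analysis uses NAMCS/NHAMCS survey data with design weights.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
visits = pd.DataFrame({
    "nonwhite": rng.integers(0, 2, n),   # hypothetical race indicator
    "medicaid": rng.integers(0, 2, n),   # hypothetical payer indicator
})
# Simulated outcome purely for illustration
logit = -3 + 0.9 * visits["nonwhite"] + 1.2 * visits["medicaid"]
visits["hcv_visit"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = smf.logit("hcv_visit ~ nonwhite + medicaid", data=visits).fit(disp=False)
print(np.exp(model.params))      # odds ratios, analogous to the reported ORs
print(np.exp(model.conf_int()))  # 95% confidence intervals on the OR scale
```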

    Exact Speedup Factors and Sub-Optimality for Non-Preemptive Scheduling

    Fixed priority scheduling is used in many real-time systems; however, both preemptive and non-preemptive variants (FP-P and FP-NP) are known to be sub-optimal when compared to an optimal uniprocessor scheduling algorithm such as preemptive earliest deadline first (EDF-P). In this paper, we investigate the sub-optimality of fixed priority non-preemptive scheduling. Specifically, we derive the exact processor speed-up factor required to guarantee the feasibility under FP-NP (i.e. schedulability assuming an optimal priority assignment) of any task set that is feasible under EDF-P. As a consequence of this work, we also derive a lower bound on the sub-optimality of non-preemptive EDF (EDF-NP). As this lower bound matches a recently published upper bound for the same quantity, it settles the exact sub-optimality of EDF-NP. It is known that neither preemptive nor non-preemptive fixed priority scheduling dominates the other; in other words, there are task sets that are feasible on a processor of unit speed under FP-P but not under FP-NP, and vice versa. Hence, when comparing these two algorithms, there are non-trivial speedup factors in both directions. We derive the exact speed-up factor required to guarantee the FP-NP feasibility of any FP-P feasible task set. Further, we derive the exact speed-up factor required to guarantee FP-P feasibility of any constrained-deadline FP-NP feasible task set.
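    To make the FP-NP side of this comparison concrete, the sketch below implements a simplified version of the classical blocking-plus-interference response-time recurrence for non-preemptive fixed-priority scheduling (checking only the first job in a busy period, whereas the exact test examines every job in the priority-level-i active period). It illustrates FP-NP schedulability analysis in general, not the speed-up-factor derivation of this paper, and the task parameters are made up.

```python
# Simplified FP-NP response-time sketch: blocking from lower-priority tasks
# plus interference from higher-priority jobs released before the job starts.
# Checks only the first job in a busy period; the exact analysis examines all
# jobs in the priority-level-i active period. Task parameters are illustrative.
import math

def fpnp_response_times(tasks):
    """tasks: list of dicts with C (WCET), T (period), D (deadline),
    ordered from highest to lowest priority."""
    responses = []
    for i, ti in enumerate(tasks):
        hp, lp = tasks[:i], tasks[i + 1:]
        blocking = max((t["C"] for t in lp), default=0.0)
        w = blocking + sum(t["C"] for t in hp)          # candidate start time
        while True:
            w_next = blocking + sum((math.floor(w / t["T"]) + 1) * t["C"] for t in hp)
            if w_next == w or w_next + ti["C"] > ti["D"]:
                w = w_next
                break
            w = w_next
        responses.append(w + ti["C"])                   # non-preemptive: start + own WCET
    return responses

tasks = [{"C": 1, "T": 5, "D": 5}, {"C": 2, "T": 8, "D": 8}, {"C": 3, "T": 20, "D": 20}]
print([r <= t["D"] for r, t in zip(fpnp_response_times(tasks), tasks)])
```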

    The Pioneer Anomaly

    Radio-metric Doppler tracking data received from the Pioneer 10 and 11 spacecraft from heliocentric distances of 20-70 AU have consistently indicated the presence of a small, anomalous, blue-shifted frequency drift changing uniformly at a rate of ~6 x 10^{-9} Hz/s. Ultimately, the drift was interpreted as a constant sunward deceleration of each spacecraft at the level of a_P = (8.74 +/- 1.33) x 10^{-10} m/s^2. This apparent violation of Newton's gravitational inverse-square law has become known as the Pioneer anomaly; the nature of this anomaly remains unexplained. In this review, we summarize the current knowledge of the physical properties of the anomaly and the conditions that led to its detection and characterization. We review various mechanisms proposed to explain the anomaly and discuss the current state of efforts to determine its nature. A comprehensive new investigation of the anomalous behavior of the two Pioneers has begun recently. The new efforts rely on the much-extended set of radio-metric Doppler data for both spacecraft in conjunction with the newly available complete record of their telemetry files and a large archive of original project documentation. As the new study is yet to report its findings, this review provides the necessary background for the new results to appear in the near future. In particular, we provide a significant amount of information on the design, operations and behavior of the two Pioneers during their entire missions, including descriptions of various data formats and techniques used for their navigation and radio-science data analysis. As most of this information was recovered relatively recently, it was not used in the previous studies of the Pioneer anomaly, but it is critical for the new investigation. Comment: 165 pages, 40 figures, 16 tables; accepted for publication in Living Reviews in Relativity.
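    As a quick consistency check on the two numbers quoted above, the sketch below converts the anomalous acceleration into an expected Doppler drift using the approximate one-way relation df/dt ≈ (a/c)·f₀. Both the use of this simplified relation and the assumed S-band reference frequency of about 2.29 GHz are assumptions made here for illustration; only a_P and the ~6 x 10^{-9} Hz/s drift come from the abstract itself.

```python
# Back-of-the-envelope check that the quoted drift and acceleration are of a
# consistent magnitude. Assumptions (not from the review): one-way relation
# df/dt ~ (a/c) * f0 and an S-band reference frequency near 2.29 GHz.
c = 2.998e8        # speed of light, m/s
f0 = 2.29e9        # assumed S-band reference frequency, Hz
a_P = 8.74e-10     # reported anomalous acceleration, m/s^2

drift = a_P / c * f0
print(f"expected drift ~ {drift:.1e} Hz/s")   # ~6.7e-9 Hz/s, same order as the quoted ~6e-9
```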

    Two-dimensional electrophoretic comparison of metastatic and non-metastatic human breast tumors using in vitro cultured epithelial cells derived from the cancer tissues

    Background: Breast carcinomas represent a heterogeneous group of tumors diverse in behavior, outcome, and response to therapy. Identification of proteins reflecting the tumor biology can improve diagnosis, prediction, treatment selection, and targeting of therapy. Since the beginning of the post-genomic era, the focus of molecular biology has gradually moved from genomes to proteins and proteomes and to their functionality. Proteomics can potentially capture dynamic changes in protein expression, integrating both genetic and epigenetic influences. Methods: We prepared primary cultures of epithelial cells from 23 breast cancer tissue samples and performed comparative proteomic analysis. Seven patients developed distant metastases within a three-year follow-up; these samples formed the metastasis-positive group, and the remaining samples formed the metastasis-negative group. Two-dimensional electrophoresis (2-DE) gels in the pH range 4–7 were prepared. Spot densities in the 2-DE protein maps were subjected to statistical analysis (R/maanova package) and data-mining analysis (GUHA). Liquid chromatography-tandem mass spectrometry (LC-MS/MS) was employed to identify the proteins in selected spots. Results: Three protein spots were significantly altered between the metastatic and non-metastatic groups; the correlations were proven at the 0.05 significance level. Nucleophosmin was increased in the group with metastases, while the levels of 2,3-trans-enoyl-CoA isomerase and glutathione peroxidase 1 were decreased. Conclusion: We have performed an extensive proteomic study of mammary epithelial cells from breast cancer patients and found proteins differentially expressed between the metastasis-positive and metastasis-negative patient groups.
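    The sketch below illustrates one simple way to compare 2-DE spot densities between two patient groups: spot-wise Welch t-tests followed by a Benjamini-Hochberg false-discovery-rate correction. This is a stand-in for, not a reproduction of, the R/maanova and GUHA analyses used in the study, and the spot-density matrix is simulated.

```python
# Spot-wise group comparison on a simulated 2-DE spot-density matrix.
# Welch t-tests + Benjamini-Hochberg FDR stand in for the study's
# R/maanova mixed-model and GUHA data-mining analyses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_spots, n_pos, n_neg = 500, 7, 16            # 7 metastasis-positive, 16 negative samples
pos = rng.normal(0.0, 1.0, (n_pos, n_spots))
neg = rng.normal(0.0, 1.0, (n_neg, n_spots))
pos[:, :3] += 1.5                             # plant a few truly altered spots

_, pvals = stats.ttest_ind(pos, neg, axis=0, equal_var=False)

# Benjamini-Hochberg adjustment at a 5% false discovery rate
order = np.argsort(pvals)
adjusted = pvals[order] * n_spots / (np.arange(n_spots) + 1)
adjusted = np.minimum.accumulate(adjusted[::-1])[::-1]
print("candidate differential spots:", order[adjusted <= 0.05])
```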

    Measurement of the cross-section of high transverse momentum vector bosons reconstructed as single jets and studies of jet substructure in pp collisions at √s = 7 TeV with the ATLAS detector

    This paper presents a measurement of the cross-section for high transverse momentum W and Z bosons produced in pp collisions and decaying to all-hadronic final states. The data used in the analysis were recorded by the ATLAS detector at the CERN Large Hadron Collider at a centre-of-mass energy of √s = 7 TeV and correspond to an integrated luminosity of 4.6 fb−1. The measurement is performed by reconstructing the boosted W or Z bosons in single jets. The reconstructed jet mass is used to identify the W and Z bosons, and a jet substructure method based on energy cluster information in the jet centre-of-mass frame is used to suppress the large multi-jet background. The cross-section for events with a hadronically decaying W or Z boson, with transverse momentum pT > 320 GeV and pseudorapidity |η| < 1.9, is measured to be σ_{W+Z} = 8.5 ± 1.7 pb and is compared to next-to-leading-order calculations. The selected events are further used to study jet grooming techniques.
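    A schematic of the cross-section extraction behind a measurement like this is sketched below, using σ = (N_obs − N_bkg) / (ε · L). The luminosity is the 4.6 fb−1 quoted in the abstract, but the event counts and efficiency are invented placeholders; the actual analysis fits the jet-mass spectrum and propagates systematic uncertainties rather than performing a simple counting subtraction.

```python
# Schematic cross-section extraction: sigma = (N_obs - N_bkg) / (eff * lumi).
# Event counts and efficiency below are invented placeholders; only the
# integrated luminosity comes from the abstract.
import math

lumi_pb = 4.6 * 1000.0        # 4.6 fb^-1 expressed in pb^-1
n_obs = 1200.0                # hypothetical events in the jet-mass signal window
n_bkg = 950.0                 # hypothetical multi-jet background estimate
eff = 0.0064                  # hypothetical acceptance x efficiency

sigma = (n_obs - n_bkg) / (eff * lumi_pb)        # pb
stat = math.sqrt(n_obs) / (eff * lumi_pb)        # Poisson statistical uncertainty
print(f"sigma ~ {sigma:.1f} +/- {stat:.1f} (stat) pb")
```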

    Search for direct pair production of the top squark in all-hadronic final states in proton-proton collisions at √s = 8 TeV with the ATLAS detector

    The results of a search for direct pair production of the scalar partner to the top quark, using an integrated luminosity of 20.1 fb−1 of proton–proton collision data at √s = 8 TeV recorded with the ATLAS detector at the LHC, are reported. The top squark is assumed to decay via t̃ → t χ̃⁰₁ or t̃ → b χ̃±₁ → b W(*) χ̃⁰₁, where χ̃⁰₁ (χ̃±₁) denotes the lightest neutralino (chargino) in supersymmetric models. The search targets a fully-hadronic final state in events with four or more jets and large missing transverse momentum. No significant excess over the Standard Model background prediction is observed, and exclusion limits are reported in terms of the top squark and neutralino masses and as a function of the branching fraction of t̃ → t χ̃⁰₁. For a branching fraction of 100%, top squark masses in the range 270–645 GeV are excluded for χ̃⁰₁ masses below 30 GeV. For a branching fraction of 50% to either t̃ → t χ̃⁰₁ or t̃ → b χ̃±₁, and assuming the χ̃±₁ mass to be twice the χ̃⁰₁ mass, top squark masses in the range 250–550 GeV are excluded for χ̃⁰₁ masses below 60 GeV.
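    The sketch below illustrates, in toy form, how an exclusion limit arises in a single counting experiment: the 95% CL upper limit on the number of signal events is the smallest signal hypothesis for which P(N ≤ N_obs | s + b) falls below 0.05 (a simple CLs+b criterion). The real search instead uses profile-likelihood fits over several signal regions with systematic uncertainties, and the numbers here are invented.

```python
# Toy 95% CL upper limit on signal events in one counting experiment, using
# the simple CL_{s+b} criterion P(N <= n_obs | s + b) <= 0.05. Invented
# numbers; the actual search uses profile-likelihood fits over many regions.
from scipy import stats

n_obs = 12      # hypothetical observed events in a signal region
b_exp = 10.5    # hypothetical expected Standard Model background

def signal_upper_limit(n_obs, b_exp, cl=0.95, step=0.01):
    s = 0.0
    # Raise the signal hypothesis until it is excluded at the requested CL
    while stats.poisson.cdf(n_obs, b_exp + s) > 1.0 - cl:
        s += step
    return s

s95 = signal_upper_limit(n_obs, b_exp)
print(f"95% CL upper limit on signal events: ~{s95:.1f}")
# A model predicting more than s95 signal events in this region would be excluded.
```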

    Search for pair-produced long-lived neutral particles decaying to jets in the ATLAS hadronic calorimeter in pp collisions at √s = 8 TeV

    The ATLAS detector at the Large Hadron Collider at CERN is used to search for the decay of a scalar boson to a pair of long-lived particles, neutral under the Standard Model gauge group, in 20.3 fb−1 of data collected in proton–proton collisions at √s = 8 TeV. This search is sensitive to long-lived particles that decay to Standard Model particles producing jets at the outer edge of the ATLAS electromagnetic calorimeter or inside the hadronic calorimeter. No significant excess of events is observed. Limits are reported on the product of the scalar boson production cross section times the branching ratio into long-lived neutral particles as a function of the proper lifetime of the particles. Limits are reported for boson masses from 100 GeV to 900 GeV and long-lived neutral particle masses from 10 GeV to 150 GeV.
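    The lifetime dependence of such limits comes largely from the probability that the long-lived particle actually decays inside the instrumented region. The sketch below evaluates that probability for a decay volume between two radii, using an exponential decay law with lab-frame decay length βγ·cτ; the radii (roughly where the hadronic calorimeter sits) and the boost are rough assumptions, not numbers from the paper.

```python
# Probability that a long-lived particle decays between radii r1 and r2,
# assuming an exponential decay law with lab-frame decay length beta*gamma*c*tau.
# The radii and boost are rough assumptions, not values from the paper.
import numpy as np

r1, r2 = 2.0, 4.0        # assumed radial extent of the hadronic-calorimeter region, m
beta_gamma = 2.5         # assumed typical boost of the long-lived particle

def p_decay_in_region(ctau_m):
    lam = beta_gamma * ctau_m            # mean lab-frame decay length, m
    return np.exp(-r1 / lam) - np.exp(-r2 / lam)

for ctau in (0.1, 0.5, 1.0, 5.0, 20.0):  # proper decay length c*tau in metres
    print(f"c*tau = {ctau:5.1f} m -> P(decay in region) = {p_decay_in_region(ctau):.3f}")
```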

    Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include ease of logistical implementation, standardization, and sampling representativeness. Nonetheless, study designs still have to deal with detectability issues related to how different species behave and use the environment, and there is considerable sampling heterogeneity across the available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the common-sense hypothesis that the first six hours comprise the period of peak night activity for several species, thereby resulting in a representative sample for the whole night. To this end, we combined re-sampling techniques, species accumulation curves, threshold analysis, and community concordance of species compositional data, and applied them to datasets from three different Neotropical biomes (Amazonia, Atlantic Forest and Cerrado). We show that restricting sampling to only six hours of the night frequently results in incomplete representation of the entire bat community investigated. From a quantitative standpoint, the results corroborated the existence of a major sample-area effect in all datasets, although for the Amazonia dataset the six-hour strategy was significantly less species-rich after extrapolation, and for the Cerrado dataset it was more efficient. From a qualitative standpoint, however, the results demonstrated that, for all three datasets, the identity of the species that are effectively sampled is inherently affected by the choice of sub-sampling schedule. We also propose an alternative six-hour sampling strategy (at the beginning and the end of a sample night), which performed better when resampling the Amazonian and Atlantic Forest bat assemblage datasets. Given the observed magnitude of our results, we propose that sample representativeness has to be carefully weighed against study objectives, and we recommend that the trade-off between logistical constraints and additional sampling performance be carefully evaluated.
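    A minimal sketch of the sample-based species accumulation step is given below: capture records are repeatedly resampled in random order and the mean cumulative species richness is tracked as sampling effort grows. It follows the spirit of the re-sampling approach described above but is not the authors' implementation, and the capture matrix is simulated.

```python
# Sample-based species accumulation curve by random resampling of survey
# sessions. Simulated capture matrix; not the authors' implementation.
import numpy as np

rng = np.random.default_rng(42)
n_samples, n_species = 60, 40                 # e.g. netting sessions x bat species
p_detect = rng.uniform(0.01, 0.4, n_species)  # uneven detectability across species
captures = rng.random((n_samples, n_species)) < p_detect

def accumulation_curve(matrix, n_permutations=200):
    n = matrix.shape[0]
    curves = np.empty((n_permutations, n))
    for k in range(n_permutations):
        order = rng.permutation(n)
        seen = np.cumsum(matrix[order], axis=0) > 0   # species detected so far
        curves[k] = seen.sum(axis=1)                  # richness after each added sample
    return curves.mean(axis=0)

curve = accumulation_curve(captures)
print("mean richness after 10, 30, 60 samples:", curve[[9, 29, 59]])
```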