
    Prediction of Neonatal Respiratory Distress Biomarker Concentration by Application of Machine Learning to Mid-Infrared Spectra

    The authors of this study developed the use of attenuated total reflectance Fourier transform infrared spectroscopy (ATR–FTIR) combined with machine learning as a point-of-care (POC) diagnostic platform, taking neonatal respiratory distress syndrome (nRDS), for which no POC test currently exists, as an example. nRDS can be diagnosed by a ratio of less than 2.2 between two nRDS biomarkers, lecithin and sphingomyelin (the L/S ratio), and in this study, ATR–FTIR spectra were recorded from L/S ratios between 1.0 and 3.4, generated using purified reagents. Principal component regression (PCR) and partial least squares regression (PLSR) models were calibrated using 155 raw, baselined, and second derivative spectra before predicting the concentrations of a further 104 spectra. A three-factor PLSR model of second derivative spectra best predicted L/S ratios across the full range (R²: 0.967; MSE: 0.014). The L/S ratios from 1.0 to 3.4 were predicted with a prediction interval of +0.29, −0.37 when using a second derivative spectra PLSR model, with a mean prediction interval of +0.26, −0.34 around the L/S 2.2 region. These results support the validity of combining ATR–FTIR with machine learning to develop a point-of-care device for detecting and quantifying any biomarker with an interpretable mid-infrared spectrum.
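    As a rough illustration of the calibration workflow described above, the following is a minimal sketch of fitting a three-factor PLSR model to second derivative spectra and predicting L/S ratios. It assumes scikit-learn and SciPy; the spectra, array sizes, preprocessing parameters, and calibration/prediction split are illustrative placeholders, not the study's data or settings.

```python
# Sketch: calibrate a three-factor PLSR model on second derivative spectra and
# predict L/S ratios. Spectra and L/S values below are synthetic placeholders.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
n_spectra, n_wavenumbers = 259, 600           # illustrative sizes only
ls_ratio = rng.uniform(1.0, 3.4, n_spectra)   # target L/S ratios
spectra = np.outer(ls_ratio, rng.normal(size=n_wavenumbers)) \
          + rng.normal(scale=0.05, size=(n_spectra, n_wavenumbers))

# Second derivative preprocessing (Savitzky-Golay); parameters are placeholders.
spectra_d2 = savgol_filter(spectra, window_length=11, polyorder=3, deriv=2, axis=1)

# Calibration/prediction split, then a three-factor PLSR model.
X_cal, X_pred, y_cal, y_true = train_test_split(
    spectra_d2, ls_ratio, test_size=0.4, random_state=0)
pls = PLSRegression(n_components=3)
pls.fit(X_cal, y_cal)
y_pred = pls.predict(X_pred).ravel()

print(f"R² = {r2_score(y_true, y_pred):.3f}, "
      f"MSE = {mean_squared_error(y_true, y_pred):.3f}")
```

    Reporting prediction intervals around each predicted L/S ratio, as the study does, would additionally require an uncertainty estimate, for example from cross-validation residuals.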

    Testing Leggett's Inequality Using Aharonov-Casher Effect

    Bell's inequality is established based on local realism. The violation of Bell's inequality by quantum mechanics implies that locality, realism, or both are untenable. Leggett's inequality is derived based on nonlocal realism. The violation of Leggett's inequality implies that quantum mechanics is neither local realistic nor nonlocal realistic. The incompatibility of nonlocal realism and quantum mechanics has been recently confirmed by photon experiments. In our work, we propose to test Leggett's inequality using the Aharonov-Casher effect. In our scheme, four entangled particles emitted from two sources manifest a two-qubit-type correlation that may result in the violation of the Leggett inequality, while satisfying the no-signaling condition for spacelike separation. Our scheme is tolerant to some local inaccuracies due to the topological nature of the Aharonov-Casher phase. The experimental implementation of our scheme can possibly be realized by a calcium atomic polarization interferometer experiment. Comment: 7 pages, 2 figures. Accepted by Scientific Reports.

    An experimental test of non-local realism

    Most working scientists hold fast to the concept of 'realism' - a viewpoint according to which an external reality exists independent of observation. But quantum physics has shattered some of our cornerstone beliefs. According to Bell's theorem, any theory that is based on the joint assumption of realism and locality (meaning that local events cannot be affected by actions in space-like separated regions) is at variance with certain quantum predictions. Experiments with entangled pairs of particles have amply confirmed these quantum predictions, thus rendering local realistic theories untenable. Maintaining realism as a fundamental concept would therefore necessitate the introduction of 'spooky' actions that defy locality. Here we show by both theory and experiment that a broad and rather reasonable class of such non-local realistic theories is incompatible with experimentally observable quantum correlations. In the experiment, we measure previously untested correlations between two entangled photons, and show that these correlations violate an inequality proposed by Leggett for non-local realistic theories. Our result suggests that giving up the concept of locality is not sufficient to be consistent with quantum experiments, unless certain intuitive features of realism are abandoned. Comment: Minor corrections to the manuscript; the final inequality and all its conclusions do not change. Description of corrections (Corrigendum) added as new Appendix III; Appendix II replaced by a shorter derivation.

    Standards and Practices for Forecasting

    One hundred and thirty-nine principles are used to summarize knowledge about forecasting. They cover formulating a problem, obtaining information about it, selecting and applying methods, evaluating methods, and using forecasts. Each principle is described along with its purpose, the conditions under which it is relevant, and the strength and sources of evidence. A checklist of principles is provided to assist in auditing the forecasting process. An audit can help one to find ways to improve the forecasting process and to avoid legal liability for poor forecasting.

    Conclusive quantum steering with superconducting transition edge sensors

    Quantum steering allows two parties to verify shared entanglement even if one measurement device is untrusted. A conclusive demonstration of steering through the violation of a steering inequality is of considerable fundamental interest and opens up applications in quantum communication. To date all experimental tests with single photon states have relied on post-selection, allowing untrusted devices to cheat by hiding unfavourable events in losses. Here we close this "detection loophole" by combining a highly efficient source of entangled photon pairs with superconducting transition edge sensors. We achieve an unprecedented ~62% conditional detection efficiency of entangled photons and violate a steering inequality with the minimal number of measurement settings by 48 standard deviations. Our results provide a clear path to practical applications of steering and to a photonic loophole-free Bell test.Comment: Preprint of 7 pages, 3 figures; the definitive version is published in Nature Communications, see below. Also, see related experimental work by A. J. Bennet et al., arXiv:1111.0739 and B. Wittmann et al., arXiv:1111.076

    Experimental loophole-free violation of a Bell inequality using entangled electron spins separated by 1.3 km

    For more than 80 years, the counterintuitive predictions of quantum theory have stimulated debate about the nature of reality. In his seminal work, John Bell proved that no theory of nature that obeys locality and realism can reproduce all the predictions of quantum theory. Bell showed that in any local realist theory the correlations between distant measurements satisfy an inequality and, moreover, that this inequality can be violated according to quantum theory. This provided a recipe for experimental tests of the fundamental principles underlying the laws of nature. In the past decades, numerous ingenious Bell inequality tests have been reported. However, because of experimental limitations, all experiments to date required additional assumptions to obtain a contradiction with local realism, resulting in loopholes. Here we report on a Bell experiment that is free of any such additional assumption and thus directly tests the principles underlying Bell's inequality. We employ an event-ready scheme that enables the generation of high-fidelity entanglement between distant electron spins. Efficient spin readout avoids the fair sampling assumption (detection loophole), while the use of fast random basis selection and readout combined with a spatial separation of 1.3 km ensure the required locality conditions. We perform 245 trials testing the CHSH-Bell inequality S ≤ 2 and find S = 2.42 ± 0.20. A null hypothesis test yields a probability of p = 0.039 that a local-realist model for space-like separated sites produces data with a violation at least as large as observed, even when allowing for memory in the devices. This result rules out large classes of local realist theories, and paves the way for implementing device-independent quantum-secure communication and randomness certification. Comment: Raw data will be made available after publication.
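    As a point of reference for the CHSH bound quoted above, here is a small sketch of how the CHSH parameter S is assembled from two-setting correlations. It uses the ideal singlet-state correlation E(a, b) = -cos(a - b) and standard textbook angle choices, not the experimental settings or data of this study.

```python
# Sketch: evaluate the CHSH combination S from pairwise correlations, using the
# ideal singlet-state correlation E(a, b) = -cos(a - b). Angles are the usual
# textbook choices, not the settings used in the experiment described above.
import numpy as np

def E(a, b):
    """Correlation of spin measurements along directions a and b on a singlet pair."""
    return -np.cos(a - b)

a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(f"|S| = {abs(S):.3f}")  # ≈ 2.828 = 2·sqrt(2), exceeding the local-realist bound of 2
```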

    In vivo rate-determining steps of tau seed accumulation in Alzheimer's disease

    Both the replication of protein aggregates and their spreading throughout the brain are implicated in the progression of Alzheimer’s disease (AD). However, the rates of these processes are unknown and the identity of the rate-determining process in humans has therefore remained elusive. By bringing together chemical kinetics with measurements of tau seeds and aggregates across brain regions, we can quantify their replication rate in human brains. Notably, we obtain comparable rates in several different datasets, with five different methods of tau quantification, from postmortem seed amplification assays to tau PET studies in living individuals. Our results suggest that from Braak stage III onward, local replication, rather than spreading between brain regions, is the main process controlling the overall rate of accumulation of tau in neocortical regions. The number of seeds doubles only every ∼5 years. Thus, limiting local replication likely constitutes the most promising strategy to control tau accumulation during AD.
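    To make the reported doubling time concrete, here is a back-of-the-envelope sketch converting a ~5-year doubling time into an effective replication rate constant under simple exponential growth; the initial seed number is an arbitrary placeholder, not a value from the study.

```python
# Back-of-the-envelope sketch: a ~5-year doubling time implies an effective
# replication rate constant k = ln(2) / t_double under exponential growth.
# The initial (relative) seed number below is an arbitrary placeholder.
import numpy as np

t_double_years = 5.0
k = np.log(2) / t_double_years           # effective replication rate, per year
print(f"k ≈ {k:.3f} per year")

n0 = 1.0                                  # relative seed number at t = 0
for t in (0, 5, 10, 20):
    print(f"t = {t:2d} y: relative seed number ≈ {n0 * np.exp(k * t):.1f}")
```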

    Selecting Forecasting Methods

    I examined six ways of selecting forecasting methods: Convenience, “what’s easy,” is inexpensive but risky. Market popularity, “what others do,” sounds appealing but is unlikely to be of value because popularity and success may not be related and because it overlooks some methods. Structured judgment, “what experts advise,” which is to rate methods against prespecified criteria, is promising. Statistical criteria, “what should work,” are widely used and valuable, but risky if applied narrowly. Relative track records, “what has worked in this situation,” are expensive because they depend on conducting evaluation studies. Guidelines from prior research, “what works in this type of situation,” rely on published research and offer a low-cost, effective approach to selection. Using a systematic review of prior research, I developed a flow chart to guide forecasters in selecting among ten forecasting methods. Some key findings: Given enough data, quantitative methods are more accurate than judgmental methods. When large changes are expected, causal methods are more accurate than naive methods. Simple methods are preferable to complex methods; they are easier to understand, less expensive, and seldom less accurate. To select a judgmental method, determine whether there are large changes, frequent forecasts, conflicts among decision makers, and policy considerations. To select a quantitative method, consider the level of knowledge about relationships, the amount of change involved, the type of data, the need for policy analysis, and the extent of domain knowledge. When selection is difficult, combine forecasts from different methods.
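    To illustrate how a few of these heuristics might be encoded, here is a highly simplified sketch of a selection function. The condition names, branching, and suggested methods are illustrative only; they do not reproduce the published flow chart or its full set of ten methods.

```python
# Highly simplified sketch of the selection heuristics summarized above.
# Condition names, branching, and method labels are illustrative placeholders.
def select_forecasting_method(enough_data: bool,
                              large_changes_expected: bool,
                              good_knowledge_of_relationships: bool) -> str:
    if not enough_data:
        # With little quantitative data, fall back on structured judgmental methods.
        return "judgmental (e.g. structured expert judgment)"
    if large_changes_expected and good_knowledge_of_relationships:
        # Causal methods tend to beat naive extrapolation when large changes are expected.
        return "causal / econometric model"
    # Otherwise prefer a simple method; simple methods are seldom less accurate.
    return "simple extrapolation (e.g. exponential smoothing)"

print(select_forecasting_method(enough_data=True,
                                large_changes_expected=True,
                                good_knowledge_of_relationships=True))
```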

    Life cycle assessment of European anchovy (Engraulis encrasicolus) landed by purse seine vessels in northern Spain

    Purpose: The main purpose of this article is to assess the environmental impacts associated with the fishing operations related to European anchovy fishing in Cantabria (northern Spain) under a life cycle approach. Methods: The life cycle assessment (LCA) methodology was applied for this case study including construction, maintenance, use, and end of life of the vessels. The functional unit used was 1 kg of landed round anchovy at port. Inventory data were collected for the main inputs and outputs of 32 vessels, representing a majority of vessels in the fleet. Results and discussion: Results indicated, in a similar line to what is reported in the literature, that the production, transportation, and use of diesel were the main environmental hot spots in conventional impact categories. Moreover, in this case, the production and transportation of seine nets was also relevant. Impacts linked to greenhouse gas (GHG) emissions suggest that emissions were in the upper range for fish species captured with seine nets, and the value of global warming potential (GWP) was 1.44 kg CO2 eq per functional unit. The ecotoxicity impacts were mainly due to the emissions of antifouling substances to the ocean. Regarding fishery-specific categories, many were discarded given the lack of detailed stock assessments for this fishery. Hence, only the biotic resource use category was computed, demonstrating that the ecosystems' effort to sustain the fishery is relatively low. Conclusions: The use of the LCA methodology allowed the identification of the main environmental hot spots of the purse seining fleet targeting European anchovy in Cantabria. Individualized results per port or per vessel suggested that there are significant differences in GHG emissions between groups. In addition, fuel use is high when compared to similar fisheries. Therefore, research needs to be undertaken to identify why fuel use is so high, particularly whether it is related to biomass and fisheries management or whether skipper decisions could play a role.

    The authors thank the Ministry of Economy and Competitiveness of the Spanish Government for their financial support via the project GeSAC-Conserva: Sustainable Management of the Cantabrian Anchovies (CTM2013-43539-R) and Pedro Villanueva-Rey for valuable scientific exchange. Jara Laso thanks the Ministry of Economy and Competitiveness of the Spanish Government for their financial support via the research fellowship BES-2014-069368 and the Ministry of Rural Environment, Fisheries and Food of Cantabria for the data support. Dr. Ian Vázquez-Rowe thanks the Peruvian LCA Network for operational support. Reviewers are also thanked for the valuable and detailed suggestions. The work of Dr. Rosa M. Crujeiras has been funded by MTM2016-76969P (AEI/FEDER, UE).
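    As a rough illustration of the functional-unit normalization used in an LCA of this kind, the sketch below scales a single impact contribution (diesel production and combustion) to 1 kg of landed anchovy. All quantities and the emission factor are placeholders for illustration, not the study's inventory data or characterization factors.

```python
# Sketch of functional-unit normalization in an LCA: scale an impact contribution
# to 1 kg of landed round anchovy. All values below are illustrative placeholders,
# not the study's inventory data or characterization factors.
diesel_use_litres = 40_000.0         # annual diesel use of one vessel (placeholder)
landed_anchovy_kg = 90_000.0         # annual landings of that vessel (placeholder)
ef_diesel_kg_co2e_per_litre = 3.0    # production + combustion factor (placeholder)

gwp_per_fu = diesel_use_litres * ef_diesel_kg_co2e_per_litre / landed_anchovy_kg
print(f"GWP from diesel alone ≈ {gwp_per_fu:.2f} kg CO2 eq per kg landed anchovy")
```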