306 research outputs found

    Self-Mixing Laser Distance-Sensor Enhanced by Multiple Modulation Waveforms

    Get PDF
    Optical rangefinders based on self-mixing interferometry are widely described in the literature, but are not yet available as commercial instruments. The main reason is that it is relatively easy to propose new processing techniques and obtain results under controlled conditions, while it is very difficult to develop a reliable instrument. In this paper, we propose a laser distance sensor with improved reliability, realized through wavelength modulation at a different frequency, which decorrelates the errors of single measurements so that averaging improves the result. Dedicated software automatically calculates the modulation pre-emphasis needed to linearize the wavelength modulation. Finally, data-selection algorithms overcome the signal-fading problems caused by the speckle effect. A prototype demonstrates the approach with about 0.1 mm accuracy up to a distance of 2 m, at 200 measurements per second.
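    As an illustration of the averaging and data-selection idea described above, the sketch below combines repeated single-sweep distance estimates and discards sweeps whose self-mixing fringe amplitude is too low, a plausible criterion for speckle fading. The function, the quality metric, and the threshold are assumptions for illustration, not the authors' implementation.

    ```python
    import numpy as np

    def robust_distance(raw_distances, fringe_amplitudes, min_amplitude):
        """Combine repeated single-sweep distance estimates.

        raw_distances     : distance estimate from each modulation sweep (m)
        fringe_amplitudes : self-mixing fringe amplitude of each sweep, used as a
                            quality metric to reject sweeps hit by speckle fading
        min_amplitude     : rejection threshold (illustrative choice)
        """
        d = np.asarray(raw_distances, dtype=float)
        a = np.asarray(fringe_amplitudes, dtype=float)
        good = a >= min_amplitude          # data selection: drop faded sweeps
        if not np.any(good):
            return float("nan")            # no usable sweep in this batch
        # Errors of the surviving sweeps are assumed decorrelated, so their mean
        # improves the accuracy roughly as 1/sqrt(N).
        return float(np.mean(d[good]))
    ```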

    Noise Decrease in a Balanced Self-Mixing Interferometer: Theory and Experiments

    Get PDF
    In a self-mixing interferometer built around a laser diode, the signals at the outputs of the two laser mirrors are in phase opposition, whereas the noise fluctuations are only partially correlated. Thus, by taking the difference between the two outputs, the useful signal is doubled in amplitude and the signal-to-noise ratio is enhanced even further. Through a second-quantization model, the improvement is theoretically predicted to depend on the reflectivities of the laser facets. The results are then validated by experimental measurements with different laser types, which show very good agreement with theory. The new technique is applicable to a number of existing self-mixing sensors, potentially improving their measurement performance significantly.
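    A toy numerical check of the balanced-detection argument (not the paper's model: the noise correlation, signal shape, and amplitudes below are illustrative assumptions) shows how subtracting two outputs that carry the signal in phase opposition, but only partially correlated noise, yields an SNR gain beyond the 3 dB that uncorrelated noise would give:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    t = np.linspace(0.0, 1.0, n)
    signal = np.sin(2 * np.pi * 5 * t)             # useful interferometric signal

    rho = 0.5                                      # assumed noise correlation between outputs
    common = rng.normal(size=n)
    noise1 = np.sqrt(rho) * common + np.sqrt(1 - rho) * rng.normal(size=n)
    noise2 = np.sqrt(rho) * common + np.sqrt(1 - rho) * rng.normal(size=n)

    front = signal + 0.5 * noise1                  # output behind one laser facet
    rear = -signal + 0.5 * noise2                  # opposite facet: signal in phase opposition
    balanced = front - rear                        # signal doubles, correlated noise partly cancels

    snr_single = np.var(signal) / np.var(0.5 * noise1)
    snr_balanced = np.var(2 * signal) / np.var(0.5 * (noise1 - noise2))
    print(f"SNR gain: {10 * np.log10(snr_balanced / snr_single):.1f} dB")
    ```

    With the assumed correlation of 0.5 the gain is about 6 dB; in the paper the achievable improvement depends on the laser facet reflectivities.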

    Impact of sterile neutrinos on the early time flux from a galactic supernova

    Get PDF
    We study the impact of the existence of an eV-mass-scale sterile neutrino (with parameters in the ballpark of what is required to fit the laboratory anomalies) on the early-time profile of the electron neutrino and antineutrino fluxes associated with a core-collapse supernova (SN). In particular, we focus on the universal feature of the neutronization burst expected in the first tens of ms of the signal: provided that a detector with sufficient sensitivity is available, it is well known that in the three-neutrino framework the detection of the neutronization burst in the neutrino channel would signal inverted mass hierarchy. This conclusion is dramatically altered in the presence of a sterile neutrino: we study here, both analytically and numerically, the region in parameter space where this characteristic signal disappears, mimicking normal-hierarchy expectations. Conversely, the detection of a peak consistent with expectations for inverted mass hierarchy would exclude the existence of a sterile state over a much wider parameter space than what is required by laboratory-anomaly fits, or is even probed by detectors coming on-line in the near future. Additionally, we show the peculiar alteration in the energy-time double-differential flux, with a delayed peak appearing for kinematical reasons, which might offer a remarkable signature in the case of favorable parameters and a high-statistics detection of a Galactic SN. We also comment on additional potentially interesting effects in the electron antineutrino channel, if more than one angle in the active-sterile sector is nonvanishing. As an ancillary result derived in the technical resolution of the equations, in an Appendix we report the Cayley-Hamilton formalism for the evolution of a four-neutrino system in matter, generalizing existing results in the literature.
    O. L. G. P. thanks the ICTP and the financial support from Grant No. 2012/16389-1, Sao Paulo Research Foundation (FAPESP). A. E. thanks the financial support from Grant No. 2009/17924-5, Sao Paulo Research Foundation, and from the funding grant Jovem Pesquisador from FAEPEX/UNICAMP. P. S. would like to thank the Instituto de Fisica Gleb Wataghin at UNICAMP for hospitality during the initial stages of this work and financial support from Grant No. 2012/08208-7, Sao Paulo Research Foundation. At LAPTh, this activity was developed coherently with the research axes supported by the Labex grant ENIGMASS. We thank A. Mirizzi for useful comments on the manuscript.
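    The Cayley-Hamilton trick mentioned at the end of the abstract can be illustrated compactly: for an N x N Hamiltonian, the evolution operator exp(-iHL) is a polynomial in H of degree N-1, with coefficients fixed by the eigenvalues. The sketch below is a generic numerical illustration with an arbitrary Hermitian test matrix (not the paper's four-neutrino Hamiltonian in matter) and assumes non-degenerate eigenvalues:

    ```python
    import numpy as np
    from scipy.linalg import expm

    def evolution_cayley_hamilton(H, L):
        """U = exp(-i H L) written as a degree-(N-1) polynomial in H.

        By the Cayley-Hamilton theorem, H**N is a combination of lower powers,
        so U = sum_k c_k H**k with the c_k fixed by requiring the relation to
        hold on every (assumed non-degenerate) eigenvalue of H.
        """
        lam = np.linalg.eigvalsh(H)                    # real eigenvalues of Hermitian H
        N = len(lam)
        V = np.vander(lam, N, increasing=True)         # V[j, k] = lam_j**k
        c = np.linalg.solve(V, np.exp(-1j * lam * L))  # sum_k c_k lam_j**k = e^{-i lam_j L}
        return sum(ck * np.linalg.matrix_power(H, k) for k, ck in enumerate(c))

    # Quick self-check against the generic matrix exponential (4 x 4, like a
    # four-neutrino Hamiltonian, but with arbitrary test numbers).
    rng = np.random.default_rng(1)
    A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    H = (A + A.conj().T) / 2
    assert np.allclose(evolution_cayley_hamilton(H, 0.7), expm(-1j * 0.7 * H))
    ```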

    Magnetron sputtering technique for analyzing the influence of RF sputtering power on microstructural surface morphology of aluminum thin films deposited on SiO2/Si substrates

    Get PDF
    In this research, aluminum (Al) thin films were deposited on SiO2/Si substrates using the RF magnetron sputtering technique in order to analyze the influence of the RF sputtering power on the microstructural surface morphology. Different RF sputtering powers (100–400 W) were employed to form the Al thin films. The characteristics of the deposited films were investigated using X-ray diffraction (XRD), scanning electron microscopy (SEM), atomic force microscopy (AFM), and Fourier-transform infrared (FTIR) spectroscopy. The XRD results demonstrate that the films deposited at low sputtering power are amorphous; as the sputtering power increases, crystallization is observed. The AFM results show that an RF power of 300 W is the optimum sputtering power for growing the smoothest Al thin films. The FTIR results show that varying the RF power affects the chemical structure of the deposited films. The SEM results show that increasing the sputtering power leads to the formation of isolated textures on the substrate surface. In conclusion, the RF power has a significant impact on the properties of the deposited films, particularly their crystallization and shape.

    Self-scheduling approach to coordinating wind power producers with energy storage and demand response

    Get PDF
    The uncertainty of wind energy makes wind power producers (WPPs) incur profit losses due to balancing costs in electricity markets, a phenomenon that restricts their market participation. This paper proposes a stochastic bidding strategy based on virtual power plants (VPPs) to increase the profit of WPPs in short-term electricity markets, in coordination with energy storage systems (ESSs) and demand response (DR). To implement the stochastic solution strategy, the Kantorovich method is used for scenario generation and reduction. The optimization problem is formulated as a Mixed-Integer Linear Programming (MILP) problem. Testing the proposed method on a Spanish WPP shows that it enhances the profit of the VPP compared to previous models.
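    The scenario-reduction step mentioned above is typically implemented as backward reduction driven by the Kantorovich (transport) distance. The sketch below is a generic, simplified version of that idea; the scenario data, distance metric, and single-scenario removal rule are my assumptions, not the paper's exact formulation:

    ```python
    import numpy as np

    def reduce_scenarios(scenarios, probs, n_keep):
        """Backward scenario reduction guided by the Kantorovich distance.

        scenarios : (S, T) array, e.g. S candidate hourly wind-production profiles
        probs     : (S,) array of scenario probabilities (sums to 1)
        n_keep    : number of scenarios to retain
        """
        scenarios = np.asarray(scenarios, dtype=float)
        probs = np.asarray(probs, dtype=float).copy()
        keep = list(range(len(scenarios)))
        # pairwise distances between scenario trajectories
        dist = np.linalg.norm(scenarios[:, None, :] - scenarios[None, :, :], axis=2)
        while len(keep) > n_keep:
            best = (np.inf, None, None)
            for i in keep:
                others = [j for j in keep if j != i]
                j = others[int(np.argmin(dist[i, others]))]  # nearest kept neighbour
                cost = probs[i] * dist[i, j]                 # probability-weighted removal cost
                if cost < best[0]:
                    best = (cost, i, j)
            _, i, j = best
            probs[j] += probs[i]        # transfer the deleted scenario's probability
            keep.remove(i)
        return scenarios[keep], probs[keep]

    # example: reduce 100 equiprobable 24-hour profiles to 10
    rng = np.random.default_rng(0)
    reduced, p = reduce_scenarios(rng.random((100, 24)), np.full(100, 0.01), 10)
    ```

    Each iteration deletes the scenario whose probability-weighted distance to its nearest kept neighbour is smallest and hands its probability to that neighbour, so the reduced distribution stays close to the original in the Kantorovich sense.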

    Investigating Risk Factors and Predicting Complications in Deep Brain Stimulation Surgery with Machine Learning Algorithms

    Full text link
    Background: Deep brain stimulation (DBS) surgery is an option for patients experiencing medically resistant neurologic symptoms. DBS complications are rare, so finding significant predictors requires a large number of surgeries. Machine learning algorithms may be used to predict these outcomes effectively. The aims of this study were to 1) investigate preoperative clinical risk factors and 2) build machine learning models to predict adverse outcomes. Methods: This multicenter registry collected clinical and demographic characteristics of patients undergoing DBS surgery (n = 501) and tabulated the occurrence of complications. Logistic regression was used to evaluate risk factors. Supervised learning algorithms were trained and validated on 70% and 30%, respectively, of both the oversampled and the original registry data. Performance was evaluated using the area under the receiver operating characteristic curve (AUC), sensitivity, specificity, and accuracy. Results: Logistic regression showed that the risk of complication was related to the institution in which the surgery was performed (odds ratio [OR] = 0.44, confidence interval [CI] = 0.25–0.78), body mass index (OR = 0.94, CI = 0.89–0.99), and diabetes (OR = 2.33, CI = 1.18–4.60). Patients with diabetes were almost 3× more likely to return to the operating room (OR = 2.78, CI = 1.31–5.88). Patients with a history of smoking were 4× more likely to experience postoperative infection (OR = 4.20, CI = 1.21–14.61). Supervised learning algorithms demonstrated high discrimination when predicting any complication (AUC = 0.86), a complication within 12 months (AUC = 0.91), return to the operating room (AUC = 0.88), and infection (AUC = 0.97). Age, body mass index, procedure side, gender, and a diagnosis of Parkinson disease were influential features. Conclusions: Multiple significant complication risk factors were identified, and supervised learning algorithms effectively predicted adverse outcomes in DBS surgery.
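    A minimal sketch of the kind of pipeline described in the Methods might look as follows. The synthetic DataFrame, feature names, and random-oversampling recipe are illustrative assumptions, not the registry's actual variables or the authors' code:

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score, accuracy_score
    from sklearn.utils import resample

    # Synthetic stand-in for the multicenter registry (n = 501); the real feature
    # set and outcome definitions differ, this only makes the sketch runnable.
    rng = np.random.default_rng(0)
    n = 501
    df = pd.DataFrame({
        "age": rng.integers(40, 85, n),
        "bmi": rng.normal(28, 5, n),
        "diabetes": rng.integers(0, 2, n),
        "smoking_history": rng.integers(0, 2, n),
        "procedure_side": rng.integers(0, 2, n),
        "parkinson_dx": rng.integers(0, 2, n),
    })
    df["complication"] = (rng.random(n) < 0.1).astype(int)

    features = ["age", "bmi", "diabetes", "smoking_history", "procedure_side", "parkinson_dx"]
    X, y = df[features], df["complication"]

    # 70/30 train/validation split, stratified because complications are rare
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

    # Randomly oversample the minority class in the training split only,
    # so that no validation information leaks into training.
    n_extra = int((y_tr == 0).sum() - (y_tr == 1).sum())
    extra = resample(X_tr[y_tr == 1], replace=True, n_samples=n_extra, random_state=0)
    X_os = pd.concat([X_tr, extra])
    y_os = pd.concat([y_tr, pd.Series(np.ones(n_extra, dtype=int), index=extra.index)])

    clf = LogisticRegression(max_iter=1000).fit(X_os, y_os)
    p = clf.predict_proba(X_va)[:, 1]
    print("AUC:", roc_auc_score(y_va, p), " accuracy:", accuracy_score(y_va, p > 0.5))
    ```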

    The philosophical analysis of technology and its relation to cyberspace

    Get PDF
    Background and Objective: Technology is a pervasive and vital feature of contemporary life. Its digital and telecommunications forms, accompanied by the transformation of social structures, have changed daily life more than the biological and ecological spheres. The essence of technology is not neutral or passive with respect to human affairs such as culture; rather, there has always been a two-way relationship between technology and the production of thought. Technology has its own kind of knowledge, tied to the philosophical foundations of its time and changing with them; this knowledge is based on performance and has no metaphysical nature. One technological event, and the one closest to the nature of technology, is the realization of cyberspace. This affinity stems from the technological nature of cyberspace, from its role as a new human habitat in the form of transhumanism, and from the rhizomatic, multidimensional, and decentralized circulation of information it encourages. The purpose of this research is a philosophical analysis of technology and of its relation to cyberspace. Methods: The research is applied in type, and the method is analytical-documentary. Findings: The findings show that what raises an object to the technological level is its structure and rationality, which both meet biological needs and generate new needs for humans, and which progress in a dialectical process toward the production of thought. The study of the relationship between thought and technology also shows that a Gestalt attitude toward technology shapes the relationship between humans and technology. This relation is explained by the specific knowledge of technology, grounded conceptually in postmodern knowledge: knowledge that is flexible, relative, and fragile, and that does not fit the rational methods of modernism. Such knowledge is present in cyberspace, the most pervasive of technological events and, above all, a reflection of the nature of technology, in the form of pretense, duality, and network development. Regarding the human relationship with technology, humans, as creators and users of technology, face challenges in ethics, the search for meaning, transcendence, and the type and extent of their responsibility when confronted with a non-technological position; they must understand the true relationship with their bodies and draw on all their material and spiritual resources to respond. Finally, within the ideological context that depicts the human confrontation with technology, cyberspace emerges as an expressive and broad form of technology: a space that clearly reflects the nature of technology and serves as a platform for the intersection of technology and society. The findings further show that the deep entanglement of humans with technology indicates the dominance of the technical object. This dominance is so great that it serves the idea of the absolute interconnectedness of the world, better describing the functional life of the postmodern human, who has cybernetic imaginary structures and, as a result of technicity, cyber-excess, and detachment from physical embodiment, suffers from narcissistic hallucinations. Conclusion: The results indicate that the dominance of the technological object and its effects in cyberspace have led to diminished responsibility and to conflicts with meaning, transcendence, and moral encounters.

    Neutrino Decays over Cosmological Distances and the Implications for Neutrino Telescopes

    Full text link
    We discuss decays of ultra-relativistic neutrinos over cosmological distances by solving the decay equation in terms of its redshift dependence. We demonstrate that there are significant conceptual differences compared to more simplified treatments of neutrino decay. For instance, the maximum distance the neutrinos have traveled is limited by the Hubble length, which means that the common belief that longer neutrino lifetimes can be probed by longer distances does not apply. As a consequence, the neutrino lifetime limit from supernova 1987A cannot be exceeded by high-energy astrophysical neutrinos. We discuss the implications for neutrino spectra and flavor ratios from gamma-ray bursts as one example of extragalactic sources, using up-to-date neutrino flux predictions. If the observation of SN 1987A implies that \nu_1 is stable and the other mass eigenstates decay with rates much smaller than their current bounds, the muon track rate can be substantially suppressed compared to the cascade rate in the region where IceCube is most sensitive. In this scenario, no gamma-ray burst neutrinos may be found using muon tracks even with the full-scale experiment, whereas reliable information on high-energy astrophysical sources can only be obtained from cascade measurements. As another consequence, the two recently observed cascade event candidates at PeV energies will not be accompanied by corresponding muon tracks.
    Comment: 20 pages, 6 figures, 1 table. Matches published version.
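    For orientation, the redshift-dependent damping factor that such a treatment produces for a mass eigenstate of mass m, rest-frame lifetime \tau_0, and observed energy E_0 takes, in a commonly used notation (not necessarily the paper's exact conventions), the form

    \[
      P_{\mathrm{surv}}(E_0, z) \;=\; \exp\!\left[-\,\frac{m}{E_0\,\tau_0}\int_0^{z}\frac{dz'}{(1+z')^{2}\,H(z')}\right],
      \qquad
      H(z) = H_0\sqrt{\Omega_m (1+z)^3 + \Omega_\Lambda},
    \]

    where one factor of $(1+z')^{-1}$ comes from the cosmological time element and the other from the redshifting of the neutrino energy. Because the integral converges to a value of order $1/H_0$ as $z \to \infty$, the effective decay baseline is bounded by the Hubble length, which is the point made in the abstract about longer distances not probing longer lifetimes.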