
    Evaluation of touch trigger probe measurement uncertainty using FEA

    Evaluation of measurement uncertainty is an essential subject in dimensional measurement. It has also become a dominant issue for coordinate measuring machines (CMMs), even though their machine performance is well accepted by many users. CMM probes, and especially the commonly used touch trigger probes, are acknowledged as a key error source, largely due to pre-travel variations. These probe errors result in large measurement uncertainty in CMM measurement. Various methods have been introduced to estimate measurement uncertainty, but they tend to be time consuming and require a large amount of experimental data for the analysis. This paper presents a method for evaluating CMM probe uncertainty using FEA modeling. It starts by investigating the behavior of the probe, recording stylus displacement under varying triggering force. Those displacement results are then analyzed with a sensitivity analysis technique to estimate the uncertainty of the recorded results.
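
    A minimal sketch of the kind of sensitivity analysis described above, assuming hypothetical FEA output: stylus displacement is tabulated against triggering force, a finite-difference sensitivity coefficient is taken, and a GUM-style propagation converts an assumed force uncertainty into a displacement (pre-travel) uncertainty. All numbers are illustrative, not the paper's data.

        # Illustrative only: finite-difference sensitivity of stylus displacement
        # to triggering force, propagated to a standard uncertainty (GUM-style).
        import numpy as np

        # Hypothetical FEA results: triggering force (N) vs. stylus displacement (um)
        force = np.array([0.05, 0.10, 0.15, 0.20, 0.25])          # N
        displacement = np.array([0.31, 0.62, 0.95, 1.29, 1.65])   # um

        # Sensitivity coefficient c = d(displacement)/d(force), central differences
        c = np.gradient(displacement, force)                      # um per N

        # Assumed standard uncertainty of the triggering force itself
        u_force = 0.01                                             # N

        # Contribution of force variation to pre-travel (displacement) uncertainty
        u_displacement = np.abs(c) * u_force                       # um
        print("sensitivity (um/N):", np.round(c, 2))
        print("u(displacement) per point (um):", np.round(u_displacement, 3))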

    A Vectoring Thrust Coaxial Rotor for Micro Air Vehicle: Modeling, Design and Analysis

    The growing interest in rotary-wing UAVs for military and civilian applications has encouraged designers to consider miniaturized configurations that are more efficient in terms of endurance, payload capability and maneuverability. The purpose of this paper is to study a new coaxial-rotor configuration applied to a micro aerial vehicle (MAV), with the intention of guaranteeing vehicle maneuverability while removing unnecessary control surfaces that would increase sensitivity to wind gusts. Coaxial rotor configurations maximize the available rotor disk surface and allow for torque cancellation, while tilting the rotors provides a means of vehicle control, as sketched below.
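
    As a generic illustration of the control concept (not the authors' specific model; the thrust T and tilt angles alpha, beta below are hypothetical quantities), counter-rotating coaxial rotors cancel their reaction torques while a tilted rotor redirects part of its thrust into control forces:

        % Torque cancellation for counter-rotating rotors:
        %   Q_net = Q_up - Q_low \approx 0.
        % Thrust vectoring: tilting a rotor of thrust T by small angles
        % (alpha, beta) yields body-frame force components
        \begin{align}
          F_x &\approx T \sin\alpha, &
          F_y &\approx T \sin\beta, &
          F_z &\approx T \cos\alpha \cos\beta .
        \end{align}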

    The Borexino Thermal Monitoring & Management System and simulations of the fluid-dynamics of the Borexino detector under asymmetrical, changing boundary conditions

    A comprehensive monitoring system for the thermal environment inside the Borexino neutrino detector was developed and installed in order to reduce uncertainties in determining temperatures throughout the detector. A complementary thermal management system limits undesirable thermal couplings between the environment and Borexino's active sections. This strategy is bringing improved radioactive background conditions to the region of interest for the physics signal, thanks to reduced fluid mixing induced in the liquid scintillator. Although fluid-dynamical equilibrium has not yet been fully reached and further thermal fine-tuning remains possible, the system has proven extremely effective at stabilizing the detector's thermal conditions while offering precise insights into its mechanisms of internal thermal transport. Furthermore, a Computational Fluid-Dynamics analysis has been performed, based on the empirical measurements provided by the thermal monitoring system, providing insight into present and future thermal trends. A two-dimensional modeling approach was implemented in order to achieve a proper understanding of the thermal and fluid dynamics in Borexino. It was optimized for different regions and periods of interest, focusing on the most critical effects that were identified as influencing background concentrations. Experimental case studies from the literature were reproduced to benchmark the method and settings, and a Borexino-specific benchmark was implemented in order to validate the modeling approach for thermal transport. Finally, fully convective models were applied to understand general and specific fluid motions impacting the detector's Active Volume. Comment: arXiv admin note: substantial text overlap with arXiv:1705.09078, arXiv:1705.0965
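
    As a rough, generic illustration of the convective-regime estimate that such a fluid-dynamics study rests on (all values below are placeholders, not Borexino parameters), the Rayleigh number of a fluid layer under a temperature difference can be computed as:

        # Illustrative Rayleigh number estimate for a fluid layer driven by a
        # temperature difference; every value here is a placeholder.
        g     = 9.81        # gravitational acceleration, m/s^2
        beta  = 1.0e-3      # thermal expansion coefficient, 1/K
        dT    = 0.5         # temperature difference across the layer, K
        L     = 1.0         # characteristic layer height, m
        nu    = 1.5e-6      # kinematic viscosity, m^2/s
        alpha = 1.0e-7      # thermal diffusivity, m^2/s

        Ra = g * beta * dT * L**3 / (nu * alpha)   # Ra = g*beta*dT*L^3 / (nu*alpha)
        print(f"Rayleigh number ~ {Ra:.2e}")       # large Ra => convection-dominated transport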

    COSMOGRAIL: XVII. Time delays for the quadruply imaged quasar PG 1115+080

    Indexed in: Scopus.

    Acknowledgements: The authors would like to thank R. Gredel for his help in setting up the program at the ESO MPIA 2.2 m telescope, and the anonymous referee for his or her comments on this work. This work is supported by the Swiss National Foundation. This research made use of Astropy, a community-developed core Python package for Astronomy (Astropy Collaboration et al. 2013, 2018), and the 2D graphics environment Matplotlib (Hunter 2007). K.R. acknowledges support from PhD fellowship FIB-UV 2015/2016 and Becas de Doctorado Nacional CONICYT 2017, and thanks the LSSTC Data Science Fellowship Program; her time as a Fellow has benefited this work. M.T. acknowledges support by the DFG grant Hi 1495/2-1. G. C.-F. C. acknowledges support from the Ministry of Education in Taiwan via Government Scholarship to Study Abroad (GSSA). D. C.-Y. Chao and S. H. Suyu gratefully acknowledge the support from the Max Planck Society through the Max Planck Research Group for S. H. Suyu. T. A. acknowledges support by the Ministry for the Economy, Development, and Tourism's Programa Iniciativa Científica Milenio through grant IC 12009, awarded to The Millennium Institute of Astrophysics (MAS).

    We present time-delay estimates for the quadruply imaged quasar PG 1115+080. Our results are based on almost daily observations over seven months at the ESO MPIA 2.2 m telescope at La Silla Observatory, reaching a signal-to-noise ratio of about 1000 per quasar image. In addition, we re-analyze existing light curves from the literature, which we complete with an additional three seasons of monitoring with the Mercator telescope at La Palma Observatory. When exploring possible sources of bias we considered the so-called microlensing time delay, a potential source of systematic error never before directly accounted for in time-delay publications. In 15 yr of data on PG 1115+080, we find no strong evidence of microlensing time delay. Not accounting for this effect, our time-delay estimates on the individual data sets are therefore in good agreement with each other and with the literature. Combining the data sets, we obtain the most precise time-delay estimates to date on PG 1115+080, with Δt(AB) = 8.3 +1.5/−1.6 days (18.7% precision), Δt(AC) = 9.9 +1.1/−1.1 days (11.1%) and Δt(BC) = 18.8 +1.6/−1.6 days (8.5%). Turning these time delays into cosmological constraints is done in a companion paper that makes use of ground-based adaptive optics (AO) with the Keck telescope. © ESO 2018.

    https://www.aanda.org/articles/aa/abs/2018/08/aa33287-18/aa33287-18.htm
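
    A toy sketch of how a time delay between two image light curves can be estimated by shifting one curve against the other and minimizing chi-square; this only illustrates the idea, it is not the COSMOGRAIL curve-shifting pipeline, and all data below are synthetic.

        # Toy chi-square curve shifting between two light curves A and B.
        import numpy as np

        def estimate_delay(t_a, mag_a, t_b, mag_b, err_b, delays):
            """Return the trial delay minimizing chi^2 of B against shifted A."""
            chi2 = []
            for dt in delays:
                # interpolate curve A at the epochs of B shifted by the trial delay
                model = np.interp(t_b - dt, t_a, mag_a)
                # remove any magnitude offset between the two images before comparing
                offset = np.mean(mag_b - model)
                chi2.append(np.sum(((mag_b - model - offset) / err_b) ** 2))
            return delays[int(np.argmin(chi2))]

        # synthetic example: curve B lags curve A by 8 days
        t = np.linspace(0.0, 200.0, 120)
        curve_a = np.sin(t / 15.0)
        curve_b = np.interp(t - 8.0, t, curve_a) + 0.01
        errors = np.full_like(t, 0.01)
        trial_delays = np.arange(-20.0, 20.0, 0.5)
        print("recovered delay (days):", estimate_delay(t, curve_a, t, curve_b, errors, trial_delays))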

    Anti-alignments in conformance checking: the dark side of process models

    Conformance checking techniques assess how well a process model represents an underlying process observed through a collection of real executions. These techniques suffer from the well-known state-space explosion problem, so handling process models exhibiting large or even infinite state spaces remains a challenge. One important task in conformance checking is to assess the precision of the model with respect to the observed executions, i.e., to characterize the extent to which the model can produce behavior unrelated to the behavior observed. By avoiding the computation of the full state space of a model, current techniques only provide estimations of the precision metric, which in some situations tend to be very optimistic, thus hiding real problems a process model may have. In this paper we present the notion of anti-alignment as a concept to help unveil traces in the model that may deviate significantly from the observed behavior. Using anti-alignments, current estimations can be improved, e.g., in precision checking. We show how to express the problem of finding anti-alignments as the satisfiability of a Boolean formula, and provide a tool which can deal with large models efficiently.
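
    The paper encodes the search for anti-alignments as Boolean satisfiability; the brute-force sketch below only illustrates the underlying notion on a toy example, picking the bounded model trace that is farthest (in edit distance) from every observed trace. Model, log, and alphabet are hypothetical.

        # Toy anti-alignment: among bounded model traces, find the one maximizing
        # the minimal edit distance to the observed log. (The paper performs this
        # search via a SAT encoding; brute force is used here only to illustrate.)
        from itertools import product

        def edit_distance(a, b):
            """Levenshtein distance between two traces (sequences of events)."""
            d = [[i + j if i * j == 0 else 0 for j in range(len(b) + 1)]
                 for i in range(len(a) + 1)]
            for i in range(1, len(a) + 1):
                for j in range(1, len(b) + 1):
                    d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1,
                                  d[i - 1][j - 1] + (a[i - 1] != b[j - 1]))
            return d[len(a)][len(b)]

        # Hypothetical bounded model language and observed event log
        model_traces = list(product("abc", repeat=3))
        log = [("a", "b", "c"), ("a", "c", "b")]

        # Anti-alignment: the model trace most distant from everything in the log
        anti = max(model_traces, key=lambda m: min(edit_distance(m, l) for l in log))
        print("anti-alignment:", anti)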

    Optimal measurement of visual motion across spatial and temporal scales

    Sensory systems use limited resources to mediate the perception of a great variety of objects and events. Here a normative framework is presented for exploring how the problem of efficient allocation of resources can be solved in visual perception. Starting with a basic property of every measurement, captured by Gabor's uncertainty relation about the location and frequency content of signals, prescriptions are developed for the optimal allocation of sensors for reliable perception of visual motion. This study reveals that a large-scale characteristic of human vision (the spatiotemporal contrast sensitivity function) is similar to the optimal prescription, and it suggests that some previously puzzling phenomena of visual sensitivity, adaptation, and perceptual organization have simple principled explanations. Comment: 28 pages, 10 figures, 2 appendices; in press in Favorskaya MN and Jain LC (Eds), Computer Vision in Advanced Control Systems using Conventional and Intelligent Paradigms, Intelligent Systems Reference Library, Springer-Verlag, Berlin
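
    For reference, Gabor's uncertainty relation mentioned above bounds how well any measurement can jointly resolve a signal's location and its frequency content (written here for time and temporal frequency; the analogous bound holds per spatial dimension):

        % Gabor's uncertainty relation: the spreads of a signal in time and in
        % frequency cannot both be made arbitrarily small,
        \begin{equation}
          \Delta t \, \Delta f \;\ge\; \frac{1}{4\pi},
        \end{equation}
        % with equality attained by Gaussian-windowed sinusoids (Gabor functions).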