323 research outputs found

    Reduction of time-resolved space-based CCD photometry developed for MOST Fabry Imaging data

    The MOST (Microvariability & Oscillations of STars) satellite obtains ultraprecise photometry from space with high sampling rates and duty cycles. Astronomical photometry or imaging missions in low Earth orbits, like MOST, are especially sensitive to scattered light from Earthshine, and all these missions have a common need to extract target information from voluminous data cubes. These data cubes consist of upwards of hundreds of thousands of two-dimensional CCD frames (or sub-rasters) containing from hundreds to millions of pixels each, where the target information, superposed on background and instrumental effects, is contained only in a subset of pixels (Fabry Images, defocussed images, mini-spectra). We describe a novel reduction technique for such data cubes: resolving linear correlations of target and background pixel intensities. This stepwise multiple linear regression removes only those target variations which are also detected in the background. The advantage of regression analysis versus background subtraction is the appropriate scaling, taking into account that the amount of contamination may differ from pixel to pixel. The multivariate solution for all pairs of target/background pixels is minimally invasive of the raw photometry while being very effective in reducing contamination due to, e.g., stray light. The technique is tested and demonstrated with both simulated oscillation signals and real MOST photometry. Comment: 16 pages, 23 figures
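    The sketch below illustrates the decorrelation idea described in this abstract: a stepwise multiple linear regression of a target light curve against a set of background pixel series, adding one background pixel at a time for as long as it still reduces the residual scatter. It is a minimal, hypothetical reconstruction in Python/NumPy, not the MOST reduction pipeline; the array layout, stopping criterion, and threshold are assumptions.

```python
import numpy as np

def stepwise_decorrelate(target, background, max_terms=10, min_gain=1e-4):
    """Illustrative stepwise multiple linear regression: remove from `target`
    (1-D light curve, length N) only those variations that are also present in
    the `background` pixel series (2-D array, shape N x P, one column per
    background pixel)."""
    n, p = background.shape
    selected = []                         # background pixels kept in the model
    residual = target - target.mean()
    best_rms = residual.std()

    for _ in range(min(max_terms, p)):
        gains, fits = [], []
        for j in range(p):
            if j in selected:
                gains.append(-np.inf)
                fits.append(None)
                continue
            # Design matrix: constant + already-selected pixels + candidate j
            cols = [np.ones(n)] + [background[:, k] for k in selected] + [background[:, j]]
            A = np.column_stack(cols)
            coef, *_ = np.linalg.lstsq(A, target, rcond=None)
            res = target - A @ coef
            gains.append(best_rms - res.std())
            fits.append(res)
        j_best = int(np.argmax(gains))
        if gains[j_best] < min_gain * best_rms:
            break                         # no remaining pixel explains enough variance
        selected.append(j_best)
        residual = fits[j_best]
        best_rms = residual.std()

    # Decorrelated light curve with the original mean restored
    return residual + target.mean(), selected
```

    Because each background pixel enters through a fitted regression coefficient rather than a direct subtraction, its contamination is scaled individually, which is the advantage over plain background subtraction highlighted in the abstract.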

    Data-Driven Process Discovery: A Discrete Time Algebra for Relational Signal Analysis

    This research presents an autonomous and computationally tractable method for scientific process analysis, combining an iterative algorithmic search and a recognition technique to discover multivariate linear and non-linear relations within experimental data series. These data-driven relations provide researchers with potentially real-time insight into experimental process phenomena and behavior. The method enables the efficient search of a potentially infinite space of relations within large data series to identify relations that accurately represent process phenomena. The proposed time-series transformation encodes and compresses real-valued data into a well-defined discrete space of 13 primitive elements in which comparative evaluation between variables is both plausible and heuristically efficient. Additionally, this research develops and demonstrates binary discrete-space operations which accurately parallel their numeric-space equivalents. These operations extend the method's utility into trivariate relational analysis, and experimental evidence is offered supporting the existence of traceable multivariate signatures of incremental order within the discrete space that can be exploited for higher-dimensional analysis by means of an iterative best-n first search
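    The abstract does not spell out its 13-element alphabet, so the sketch below uses a smaller, hypothetical set of shape primitives (signs of the local first and second differences) purely to illustrate how a real-valued series can be compressed into a discrete space in which two variables are compared symbolically. The alphabet, tolerance, and similarity measure are assumptions, not the method from the thesis.

```python
import numpy as np

# Hypothetical shape alphabet (not the thesis's 13-element alphabet):
# each step is classified by the signs of the local first and second
# differences, giving nine trend/curvature primitives.
PRIMITIVES = {
    ( 1,  1): "A", ( 1, 0): "B", ( 1, -1): "C",
    ( 0,  1): "D", ( 0, 0): "E", ( 0, -1): "F",
    (-1,  1): "G", (-1, 0): "H", (-1, -1): "I",
}

def _sign(x, tol):
    # Sign with a dead band so near-zero differences count as flat
    return (np.abs(x) > tol) * np.sign(x)

def encode(series, tol=1e-9):
    """Compress a real-valued series into a symbol string so that two
    variables can be compared in the discrete space."""
    y = np.asarray(series, dtype=float)
    d1 = _sign(np.diff(y), tol)        # local trend      (length n-1)
    d2 = _sign(np.diff(y, 2), tol)     # local curvature  (length n-2)
    return "".join(PRIMITIVES[(int(a), int(b))] for a, b in zip(d1[:-1], d2))

def similarity(a, b):
    """Fraction of positions where two encodings agree: a crude stand-in
    for the relational comparison performed in the discrete space."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a, b)) / n if n else 0.0

# Example: a quadratic and a scaled, shifted copy share the same encoding,
# hinting at a linear relation between the two variables.
t = np.linspace(0.0, 1.0, 50)
print(similarity(encode(t**2), encode(3.0 * t**2 + 1.0)))   # 1.0
```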

    Protection Through Participation: Crowdsourced Tap Water Quality Monitoring for Enhanced Public Health

    Lead contamination in municipal drinking water is a national public health issue and is generally the result of water contact with leaded distribution piping and on-premise plumbing. As a result, the US Environmental Protection Agency’s Lead and Copper Rule requires point-of-use sampling at a small fraction of consumer taps on the public water distribution system. While this approach is practical, it leaves large gaps of consumers without direct monitoring and protection. In response, a novel contest-based crowdsourcing study was conducted to engage the public in monitoring their own water quality at their home taps and to study the factors that shaped participation in drinking water monitoring. Participants were recruited through social media postings, kiosks, and community events to collect samples of their household drinking water, with the chance to win a cash prize. The project distributed approximately 800 sampling packets and received 147 packets from participants, 93% of which included at least partially completed surveys. Part I of this thesis investigated lead levels, participant recruitment and demographic patterns, and motivations for participation. On average, private wells were found to have higher lead levels than the public water supply, and these higher lead levels were not attributable to older building age. There was also no statistically significant relationship between participants’ perceived and actual tap water quality. Survey responses indicated that citizens were motivated to participate in the project by concerns about their own health and/or the health of their families; in contrast, participants reported that they were not motivated by the cash prize. Part II of this thesis investigated the influence of socioeconomic characteristics on participants’ environmental literacy, behavior, and social networks. For actions taken in response to water quality issues, income, age, and educational groups showed some of the largest significant paired differences. With regard to knowledge, the project showed success in potentially improving citizens’ scientific literacy relating to key lead information, and overall provided self-assessed educational benefits to those who participated. This project helps inform future public engagement with water quality monitoring, creates new knowledge about the influence of personal motivations for participation, and provides recommendations to help increase awareness of water quality issues. It also demonstrates that the crowdsourcing method could be used to actively engage and inform citizen participants in water quality monitoring efforts, creating a more scientifically literate and active public

    Weighted Quasi Interpolant Spline Approximations: Properties and Applications

    Continuous representations are fundamental for modeling sampled data and performing computations and numerical simulations directly on the model or its elements. To effectively and efficiently address the approximation of point clouds, we propose the Weighted Quasi Interpolant Spline Approximation method (wQISA). We provide global and local bounds for the method and discuss how it preserves the shape properties of the classical quasi-interpolation scheme. This approach is particularly useful when the data noise can be represented as a probability distribution: from the point of view of nonparametric regression, the wQISA estimator is robust to random perturbations such as noise and outliers. Finally, we show the effectiveness of the method with several numerical simulations on real data, including curve fitting on images, surface approximation and simulation of rainfall precipitation
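    A minimal one-dimensional sketch of the weighted quasi-interpolation idea is given below, assuming each B-spline coefficient is taken as a kernel-weighted average of the data values near the corresponding Greville abscissa. The weight function (Gaussian), knot placement, and bandwidth are illustrative assumptions and do not reproduce the wQISA construction of the paper.

```python
import numpy as np
from scipy.interpolate import BSpline

def wqisa_1d(x, y, n_coef=20, degree=3, bandwidth=0.1):
    """Hypothetical 1-D weighted quasi-interpolant: each spline coefficient
    is a kernel-weighted average of the noisy samples near the corresponding
    Greville abscissa, so no linear system is solved and points with small
    weights (e.g. outliers far from the abscissa) have little influence."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a, b = x.min(), x.max()
    # Clamped (open) knot vector on [a, b]
    inner = np.linspace(a, b, n_coef - degree + 1)
    t = np.concatenate([[a] * degree, inner, [b] * degree])
    # Greville abscissae: averages of `degree` consecutive knots
    greville = np.array([t[i + 1:i + degree + 1].mean() for i in range(n_coef)])
    # Kernel-weighted averages of the data around each Greville point
    coef = np.empty(n_coef)
    for i, g in enumerate(greville):
        w = np.exp(-0.5 * ((x - g) / bandwidth) ** 2)   # Gaussian weights (an assumption)
        coef[i] = np.sum(w * y) / np.sum(w)
    return BSpline(t, coef, degree)

# Example: approximate noisy samples of a sine curve
rng = np.random.default_rng(0)
xs = np.sort(rng.uniform(0.0, 2.0 * np.pi, 500))
ys = np.sin(xs) + 0.1 * rng.normal(size=xs.size)
spline = wqisa_1d(xs, ys)
print(float(spline(np.pi / 2)))   # close to 1
```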

    Long-term neural and physiological phenotyping of a single human

    Psychiatric disorders are characterized by major fluctuations in psychological function over the course of weeks and months, but the dynamic characteristics of brain function over this timescale in healthy individuals are unknown. Here, as a proof of concept to address this question, we present the MyConnectome project. An intensive phenome-wide assessment of a single human was performed over a period of 18 months, including functional and structural brain connectivity using magnetic resonance imaging, psychological function and physical health, gene expression and metabolomics. A reproducible analysis workflow is provided, along with open access to the data and an online browser for results. We demonstrate dynamic changes in brain connectivity over the timescales of days to months, and relations between brain connectivity, gene expression and metabolites. This resource can serve as a testbed to study the joint dynamics of human brain and metabolic function over time, an approach that is critical for the development of precision medicine strategies for brain disorders

    Development and Testing of a New Optimum Design Code for Hypersonic Wind Tunnel Nozzles, Including Boundary Layer, Turbulence, and Real Gas Effects

    A robust and efficient optimization code is developed and validated. The code, which uses response surface methodology (RSM) techniques, is applied to the redesign of an existing Mach 12 wind tunnel nozzle. Explicit, globally second-order, flux-difference-splitting algorithms are used in the Navier-Stokes (NS) and Parabolized Navier-Stokes (PNS) flow solvers incorporated into the optimization code, and either the Baldwin-Lomax or the Yang-Shih k-ε turbulence model may be employed. First, 2-D/axisymmetric NS and PNS flow solvers are developed or modified to account for perfect-gas and nonequilibrium chemically reacting flows. All solvers are validated against other Computational Fluid Dynamics (CFD) results and experimental data. The optimization code is subsequently developed and validated, then used to optimize the Mach 12 nozzle design, and the computed results are compared with those of the original nozzle. The code is tested for robustness and on three separate occasions locates the global minimum corresponding to the 'global best' optimized nozzle. Although an optimized nozzle is obtained, its uniform inviscid core at the exit is not as free of disturbances as might be desired
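    The toy example below illustrates a single response-surface-methodology step of the kind such an optimizer relies on: sample the design space, fit a quadratic surrogate to the sampled objective values by least squares, and take the minimizer of the surrogate as the next candidate design. The objective here is a cheap analytic stand-in for the exit-flow metric that the NS/PNS solvers would compute; the function names and parameters are assumptions, not the thesis code.

```python
import numpy as np
from itertools import combinations_with_replacement

def quadratic_features(X):
    """Design matrix for a full quadratic response surface:
    constant, linear terms x_i, and all products x_i * x_j."""
    n, d = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j] for i, j in combinations_with_replacement(range(d), 2)]
    return np.column_stack(cols)

def rsm_minimize(objective, lower, upper, n_samples=40, n_candidates=20000, seed=0):
    """Toy RSM step: sample the design space, fit a quadratic surrogate by
    least squares, and return the candidate point minimizing the surrogate."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    X = rng.uniform(lower, upper, size=(n_samples, lower.size))
    y = np.array([objective(x) for x in X])      # expensive CFD runs in practice
    beta, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)
    C = rng.uniform(lower, upper, size=(n_candidates, lower.size))
    return C[np.argmin(quadratic_features(C) @ beta)]

# Stand-in objective: exit-flow nonuniformity as a smooth function of
# two hypothetical nozzle wall-contour parameters.
f = lambda x: (x[0] - 0.3) ** 2 + 2.0 * (x[1] + 0.1) ** 2 + 0.05 * x[0] * x[1]
print(rsm_minimize(f, [-1.0, -1.0], [1.0, 1.0]))   # near (0.3, -0.1)
```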