Functional Regression
Functional data analysis (FDA) involves the analysis of data whose ideal
units of observation are functions defined on some continuous domain, and the
observed data consist of a sample of functions taken from some population,
sampled on a discrete grid. Ramsay and Silverman's 1997 textbook sparked the
development of this field, which has accelerated in the past 10 years to become
one of the fastest growing areas of statistics, fueled by the growing number of
applications yielding this type of data. One unique characteristic of FDA is
the need to combine information both across and within functions, which Ramsay
and Silverman called replication and regularization, respectively. This article
will focus on functional regression, the area of FDA that has received the most
attention in applications and methodological development. First will be an
introduction to basis functions, key building blocks for regularization in
functional regression methods, followed by an overview of functional regression
methods, split into three types: [1] functional predictor regression
(scalar-on-function), [2] functional response regression (function-on-scalar)
and [3] function-on-function regression. For each, the role of replication and
regularization will be discussed and the methodological development described
in a roughly chronological manner, at times deviating from the historical
timeline to group together similar methods. The primary focus is on modeling
and methodology, highlighting the modeling structures that have been developed
and the various regularization approaches employed. At the end is a brief
discussion describing potential areas of future development in this field.
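As a minimal illustration of the scalar-on-function case, the sketch below (NumPy only; the simulated curves, the Fourier basis and the noise level are all hypothetical choices, not taken from the article) expands the coefficient function in a basis, which turns the functional model y_i = ∫ X_i(t) β(t) dt + ε_i into an ordinary least-squares fit:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 300, 50                      # number of curves, grid points per curve
t = np.linspace(0.0, 1.0, m)
dt = t[1] - t[0]

# Fourier basis used to regularize the coefficient function beta(t)
Phi = np.column_stack([np.ones_like(t),
                       np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                       np.sin(4 * np.pi * t), np.cos(4 * np.pi * t)])

beta_true = np.sin(2 * np.pi * t)   # true coefficient function (in the basis span)

X = rng.standard_normal((n, m))     # functional predictors sampled on the grid
y = X @ beta_true * dt + 0.1 * rng.standard_normal(n)

# Writing beta(t) = Phi @ c reduces the functional model to least squares
# on integrated features Z = (X @ Phi) * dt
Z = (X @ Phi) * dt
c, *_ = np.linalg.lstsq(Z, y, rcond=None)
beta_hat = Phi @ c                  # estimated coefficient function

rmse = np.sqrt(np.mean((beta_hat - beta_true) ** 2))
```

With the true β(t) inside the basis span, the fitted curve recovers it up to noise; in practice a penalized (regularized) fit would replace plain least squares.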
Galaxy density profiles and shapes -- I. simulation pipeline for lensing by realistic galaxy models
Studies of strong gravitational lensing in current and upcoming wide and deep
photometric surveys, and of stellar kinematics from (integral-field)
spectroscopy at increasing redshifts, promise to provide valuable constraints
on galaxy density profiles and shapes. However, both methods are affected by
various selection and modelling biases, which we aim to investigate in a
consistent way. In this first paper in a series we develop a flexible but
efficient pipeline to simulate lensing by realistic galaxy models. These galaxy
models have separate stellar and dark matter components, each with a range of
density profiles and shapes representative of early-type, central galaxies
without significant contributions from other nearby galaxies. We use Fourier
methods to calculate the lensing properties of galaxies with arbitrary surface
density distributions, and Monte Carlo methods to compute lensing statistics
such as point-source lensing cross-sections. Incorporating a variety of
magnification bias modes lets us examine different survey limitations in image
resolution and flux. We rigorously test the numerical methods for systematic
errors and sensitivity to basic assumptions. We also determine the minimum
number of viewing angles that must be sampled in order to recover accurate
orientation-averaged lensing quantities. We find that for a range of
non-isothermal stellar and dark matter density profiles typical of elliptical
galaxies, the combined density profile and corresponding lensing properties are
surprisingly close to isothermal around the Einstein radius. The converse
implication is that constraints from strong lensing and/or stellar kinematics,
which are indeed consistent with isothermal models near the Einstein radius,
cannot trivially be extrapolated to smaller and larger radii.
Comment: 31 pages, 15 figures; paper II at arXiv:0808.2497; accepted for publication in MNRAS; PDF file with full resolution figures at http://www.sns.ias.edu/~glenn/paper1.pd
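The Fourier step of such a pipeline can be sketched as follows. This is a toy illustration, not the authors' code: it assumes periodic boundary conditions and uses an arbitrary Gaussian convergence map in place of the paper's galaxy models, solving the lensing Poisson equation ∇²ψ = 2κ in Fourier space and differentiating to get the deflection field:

```python
import numpy as np

N = 128
idx = np.arange(N) - N // 2
X, Y = np.meshgrid(idx, idx)                     # pixel coordinates, lens at (0, 0)
kappa = np.exp(-(X**2 + Y**2) / (2 * 5.0**2))    # toy Gaussian convergence map

# Poisson equation for the lensing potential: laplacian(psi) = 2 * kappa.
# In Fourier space (i k)^2 psi_hat = 2 kappa_hat, so psi_hat = -2 kappa_hat / k^2.
k = 2 * np.pi * np.fft.fftfreq(N)
KX, KY = np.meshgrid(k, k)
k2 = KX**2 + KY**2
k2[0, 0] = 1.0                                   # avoid division by zero at k = 0
psi_hat = -2 * np.fft.fft2(kappa) / k2
psi_hat[0, 0] = 0.0                              # fix the arbitrary mean of psi
psi = np.real(np.fft.ifft2(psi_hat))

# Deflection angle alpha = grad(psi); np.gradient returns (d/dy, d/dx)
alpha_y, alpha_x = np.gradient(psi)
```

For this symmetric toy lens the deflection points away from the minimum of ψ along each axis and is antisymmetric about the centre, as a quick check of the sign conventions.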
The role of the deep roots of perennial cereal Kernza in a drying climate
Agricultural lands under annual crop production are prone to degradation, and as the climate becomes increasingly variable, researchers and farmers alike are looking to resilient crops such as perennial grains to produce food regeneratively. Perennial grain crops support a myriad of ecosystem services, such as reduced nitrate leaching, erosion control and increased carbon storage. With their deep roots, perennial grain crops like Kernza (intermediate wheatgrass) could furthermore avoid surface stresses such as drought; this has, however, not been investigated before. We therefore set out to determine the depth of root water uptake (RWU) of this crop, comparing the contribution of deep roots before and after anthesis and between a year of adequate water supply (2019) and a year of drought (2018). Natural abundances of 2H and 18O were determined but could not be used properly owing to mistakes during sampling. A tracer application showed limited uptake from 2 m depth. Furthermore, soil water content measurements were used to inversely model the soil hydraulic parameters under the Kernza crop in Hydrus-1D. Modelling RWU showed that the deep roots (>1 m) were responsible for almost 50% of the RWU between anthesis and harvest in 2018, whereas they contributed only 10-15% throughout 2019 and most of 2018 outside of that period. Kernza may thus be an important addition to a farmer's toolbox in areas with periodic droughts, but only if grain yields become competitive with annual cereals or when it is used as a multifunctional crop for grain, forage and other ecosystem services.
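The depth-partitioning bookkeeping behind such contribution figures can be sketched as follows; the layer values below are purely hypothetical and are not the study's measured Kernza data:

```python
import numpy as np

# Hypothetical depth-resolved root water uptake (mm/day per 0.2 m layer);
# the numbers are illustrative only, NOT the study's data.
depths = np.linspace(0.1, 1.9, 10)   # layer mid-points in m, down to 2 m
rwu = np.array([0.50, 0.40, 0.30, 0.20, 0.15,   # layers above 1 m
                0.30, 0.28, 0.25, 0.20, 0.12])  # layers below 1 m

deep = depths >= 1.0                 # "deep roots" threshold from the abstract
deep_fraction = rwu[deep].sum() / rwu.sum()
```

In the study this uptake profile comes out of the inverse Hydrus-1D model rather than being prescribed; the fraction is then compared between pre- and post-anthesis periods.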
Probing Plasmodium falciparum sexual commitment at the single-cell level
Background: Malaria parasites go through major transitions during their complex life cycle, yet the underlying differentiation pathways remain obscure. Here we apply single-cell transcriptomics to unravel the program inducing sexual differentiation in Plasmodium falciparum. Parasites must make this essential life-cycle decision in preparation for human-to-mosquito transmission. Methods: By combining transcriptional profiling with quantitative imaging and genetics, we defined a transcriptional signature of sexually committed cells. Results: We found this transcriptional signature to be distinct from the general changes in parasite metabolism observed in response to commitment-inducing conditions. Conclusions: This proof-of-concept study provides a template for capturing transcriptional diversity in parasite populations containing complex mixtures of life-cycle stages and developmental programs, with important implications for our understanding of parasite biology and the ongoing malaria elimination campaign.
Probabilistic Treatment Planning for Carbon Ion Therapy
Intensity-modulated scanned particle therapy, in combination with the characteristic depth dose deposition of carbon ions, entails a higher sensitivity to physical changes of the patient geometry than photon therapy. As a result, carbon ions may stop at different spatial locations than predicted during treatment planning. The patient's response to radiation is also uncertain, further compromising the quality of the radiation treatment plan. The unknown level of uncertainty in the carbon ion dose requires a patient-specific uncertainty analysis and uncertainty mitigation. For this reason, the thesis at hand presents a novel method to assess and quantify carbon ion treatment plan uncertainties, considering physical uncertainties, biological uncertainties and fractionation effects. Second, the manuscript demonstrates how these uncertainties are mitigated in a subsequent probabilistic optimization. The proposed methodology was applied to multiple clinical scenarios, and its advantageous impact on the robustness of carbon ion treatment plans was demonstrated.
Unlike proton therapy, carbon ion treatment planning needs to account for the increased nonlinear cell killing of carbon ions in a mixed radiation field, which increases the treatment planning complexity. With respect to uncertainties, not only the location of dose deposition is uncertain for carbon ions but also their effectiveness, which introduces biological uncertainties into treatment planning.
In contrast to scenario-based approaches, this work presents exact and approximate nonlinear closed-form calculations of the expectation value and covariance of the RBE-weighted dose, accounting for setup, range and biological uncertainties in fractionated carbon ion therapy. The developed analytical pipeline propagates linearly correlated Gaussian input uncertainties through the carbon ion pencil beam dose calculation algorithm to obtain uncertainties in dose.
With I and J being the number of voxels and pencil beams, respectively, low-rank tensor approximations were derived for the expectation value and standard deviation, reducing the computational complexity from O(I x J^2) to O(I x J) and from O(I x J^4) to O(I x J^2) with minimal loss in accuracy. The consideration of biological errors introduces a new uncertainty structure into the analytical pipeline without increasing the computational complexity. The calculation of expected dose and variance influence information via APM allows a subsequent probabilistic optimization to be performed.
A proof of concept and several aspects, such as accuracy, fractionation and the impact of different probability densities used to model input uncertainties, were studied in detail on a one-dimensional phantom case. Further, basic three-dimensional dose calculation and optimization functionalities were implemented in the open-source treatment planning system matRad. A subsequent validation against a clinical reference system revealed excellent agreement for elementary pencil beams and patient cases, as indicated by γ-pass rates above 99.67%. The theoretical APM derivations were implemented on top and then applied to clinical carbon ion patient cases. The expectation value and standard deviation of the RBE-weighted dose were compared to estimates stemming from 5000 random samples. The γ-pass rate exceeded 94.95% in all patient cases, confirming the validity of the proposed analytical pipeline. A subsequent probabilistic optimization avoided underdosage of the target volume, reduced the integral dose and resulted in carbon ion treatment plans with a minimized standard deviation of the RBE-weighted dose. Thus the developed Analytical Probabilistic Model facilitates a flexible, effective and accurate probabilistic description of the radiation treatment plan and generalizes to probabilistic optimization.
In conclusion, the manuscript presents an analytical method to quantify and minimize the uncertainty in the delivery of carbon ion treatment plans. As a result, treatment plans become more robust against the involved uncertainties, as demonstrated for a number of clinical scenarios.
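The idea of replacing scenario sampling with closed-form moments can be illustrated on a toy 1D case. For a Gaussian lateral pencil-beam profile under a Gaussian setup error, the expected dose has a closed form, the convolution of two Gaussians, which the sketch below (illustrative widths and units, a stand-in for the thesis' APM, not the actual model) checks against random-scenario averaging:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-5.0, 5.0, 201)     # lateral position (illustrative units)
sigma_b, sigma_s = 1.0, 0.5         # beam width and setup-error standard deviation

def beam(x, shift=0.0):
    """Unit-fluence Gaussian lateral dose profile, displaced by a setup error."""
    return np.exp(-(x - shift)**2 / (2 * sigma_b**2)) / (np.sqrt(2 * np.pi) * sigma_b)

# Closed form: averaging the profile over a Gaussian setup error is a
# convolution of two Gaussians, whose variances add.
sigma_tot = np.hypot(sigma_b, sigma_s)
expected_analytic = np.exp(-x**2 / (2 * sigma_tot**2)) / (np.sqrt(2 * np.pi) * sigma_tot)

# Scenario-based estimate: average the shifted profile over sampled setup errors.
shifts = sigma_s * rng.standard_normal(20000)
expected_mc = beam(x[None, :], shifts[:, None]).mean(axis=0)

max_err = np.max(np.abs(expected_mc - expected_analytic))
```

The analytical expectation needs one evaluation where the scenario estimate needs thousands of samples, which is the computational argument the thesis extends to correlated setup, range and biological errors.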
NASA/MSFC FY88 Global Scale Atmospheric Processes Research Program Review
Interest in environmental issues and in the magnitude of environmental change continues. One way to gain more understanding of the atmosphere is to make measurements on a global scale from space. The Earth Observing System is a series of new sensors to measure atmospheric parameters globally. Analysis of satellite data, by developing algorithms to interpret the radiance information, improves this understanding and also defines requirements for these sensors. One measure of knowledge of the atmosphere lies in the ability to predict its behavior. The use of numerical and experimental models provides a better understanding of these processes. These efforts are described in the context of satellite data analysis and of fundamental studies of atmospheric dynamics that examine selected processes important to the global circulation.
The Town Lake Report, Volumes I and II
This report makes brief references to sediment and other trends seen in Waller Creek. EXECUTIVE SUMMARY: Town Lake's importance as a natural resource is growing in tandem with Austin's rapid population growth. The lake is a source of drinking water for the City, and its greenbelt and open waters are widely used for recreation and as a focal point for public events. In 1992, under the Clean Lakes program, a comprehensive report entitled the "Town Lake Study" (COA 1992a; COA 1992b; COA 1992c) was prepared. It examined the condition of the lake (Volume I), water quality control alternatives (Volume II) and a feasibility study (Volume III). This report updates the diagnostic study, Volume I (COA 1992a), including the current status of water quality with data analyzed through the year 2000. It also includes a summary of measures taken since 1990 to reduce pollution from urban runoff. Waller Creek Working Group
Application of wavelet analysis in tool wear evaluation using image processing method
Tool wear plays a significant role in the proper planning and control of machining parameters to maintain product quality. However, existing tool wear monitoring methods using sensor signals still have limitations. Since the cutting tool operates directly on the workpiece during the machining process, the machined surface provides valuable information about the cutting tool condition. Therefore, the objective of the present study is to evaluate tool wear based on the workpiece profile signature using wavelet analysis. The effects of wavelet family, wavelet scale and statistical features of the continuous wavelet coefficients on tool wear are studied. The surface profile of the workpiece was captured using a DSLR camera. An invariant moment method was applied to extract the surface profile to sub-pixel accuracy. The extracted surface profile was analyzed using a continuous wavelet transform (CWT) written in MATLAB. The results showed that the average, RMS and peak-to-valley of the CWT coefficients at all scales increased with tool wear. Peak-to-valley at higher scales is more sensitive to tool wear. Haar was found to be the most effective wavelet, correlating with tool wear with the highest R², 0.9301.
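A rough NumPy-only sketch of the feature-extraction idea might look like the following; it uses synthetic profiles instead of camera-extracted ones and a discretized Haar wavelet applied by convolution in place of the study's MATLAB CWT, so it illustrates the features, not the paper's implementation:

```python
import numpy as np

def haar_cwt(signal, scale):
    """CWT coefficients from a discretized Haar wavelet at one even integer scale."""
    half = scale // 2
    # Haar wavelet stretched over `scale` samples: +1 first half, -1 second half
    wavelet = np.concatenate([np.ones(half), -np.ones(half)]) / np.sqrt(scale)
    return np.convolve(signal, wavelet, mode="same")

def wear_features(profile, scales=(4, 8, 16)):
    """Per scale: average magnitude, RMS and peak-to-valley of CWT coefficients."""
    feats = {}
    for a in scales:
        c = haar_cwt(profile, a)
        feats[a] = (np.mean(np.abs(c)),        # average
                    np.sqrt(np.mean(c**2)),    # RMS
                    c.max() - c.min())         # peak to valley
    return feats

rng = np.random.default_rng(2)
t = np.linspace(0, 4 * np.pi, 2000)
fresh = np.sin(t) + 0.02 * rng.standard_normal(t.size)  # profile cut by a fresh tool
worn = np.sin(t) + 0.20 * rng.standard_normal(t.size)   # rougher profile, worn tool
```

On these toy profiles the RMS and peak-to-valley features grow with surface roughness at every scale, mirroring the trend the study reports against measured tool wear.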