
    GINI DP 9: Comparable Indicators of Inequality Across Countries

    This paper addresses a key issue for the GINI project: how best to approach the measurement of income inequality and wage inequality so as to enhance comparability across different studies. It focuses first on income inequality, dealing with the definition of income, the income recipient unit, and the unit of analysis. The summary measures used to capture inequality are also discussed, with an emphasis on capturing trends at different points in the distribution, as are sources of comparative data on inequality levels and trends. The paper then turns to inequality in earnings among employees and discusses the same set of issues in that context. The above bears directly on any analysis of inequality itself, but it is also important for analyses of the direct impacts of inequality at the micro level. For a (multilevel) analysis that takes aggregate inequality as an input, the paper explains the need for comparable concepts and definitions across countries and links to data sources as well as aggregate levels. It also links to the practical experiences of researchers with different datasets. For this and the datasets, see the Data Portal at http://www.gini-research.org.
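
    As a concrete illustration of the summary measures mentioned above (a sketch on synthetic data, not code or results from the paper), the Gini coefficient and a P90/P10 decile ratio can be computed as follows:

        import numpy as np

        def gini(incomes):
            # Gini coefficient from the sorted sample:
            # G = 2 * sum_i(i * x_(i)) / (n * sum_i x_(i)) - (n + 1) / n
            x = np.sort(np.asarray(incomes, dtype=float))
            n = x.size
            i = np.arange(1, n + 1)
            return 2.0 * np.sum(i * x) / (n * np.sum(x)) - (n + 1.0) / n

        rng = np.random.default_rng(0)
        income = rng.lognormal(mean=10.0, sigma=0.6, size=5000)   # hypothetical incomes
        p90_p10 = np.percentile(income, 90) / np.percentile(income, 10)
        print(f"Gini: {gini(income):.3f}   P90/P10: {p90_p10:.2f}")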

    On Ill-Conditioned Generalized Estimating Equations and Toward Unified Biased Estimation

    I address the issue of ill-conditioned regressors within generalized estimating equations (GEEs). In such a setting, standard GEE approaches can suffer from convergence problems, large coefficient variances, poor prediction, and deflated power of tests, and in some extreme cases, e.g. with functional regressors, the estimates may not even exist. I modify the quasi-likelihood score functions and present a variety of biased estimators that simultaneously address the issues of (severe) ill-conditioning and correlated response variables. To simplify the presentation, I attempt to unite or link these estimators as much as possible. Some of their properties, as well as guidelines for choosing the meta- or penalty parameters, are suggested.
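
    To illustrate the general idea of biased shrinkage estimation under ill-conditioning (a sketch on simulated, nearly collinear data; this is plain ridge regression, not the paper's modified quasi-likelihood score functions), consider:

        import numpy as np

        rng = np.random.default_rng(1)
        n, p = 200, 5
        z = rng.normal(size=(n, 1))
        X = z + 0.01 * rng.normal(size=(n, p))          # nearly collinear regressors
        beta_true = np.array([1.0, -1.0, 0.5, 0.0, 0.0])
        y = X @ beta_true + rng.normal(scale=0.5, size=n)

        print("condition number of X'X:", np.linalg.cond(X.T @ X))

        def ridge(X, y, lam):
            # Biased (ridge) estimator: (X'X + lam * I)^{-1} X'y
            return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

        for lam in (0.0, 1.0, 10.0):
            print(f"lambda = {lam:5.1f}   ||beta_hat|| = {np.linalg.norm(ridge(X, y, lam)):8.2f}")

    Accepting a little bias in exchange for a large reduction in coefficient variance is the common thread of the estimators linked in the paper.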

    Varying Coefficient Tensor Models for Brain Imaging

    We revisit a multidimensional varying-coefficient model (VCM) by allowing regressor coefficients to vary smoothly in more than one dimension, thereby extending the VCM of Hastie and Tibshirani. The motivating example is 3-dimensional, involving a special type of nuclear magnetic resonance measurement technique used to estimate the diffusion tensor at each point in the human brain. We aim to improve on the current state of the art, which is to fit a multiple regression model for each voxel separately using information from six or more volume images. We present a model, based on P-spline tensor products, that introduces spatial smoothness of the estimated diffusion tensor. Since the regression design matrix is space-invariant, a 4-dimensional tensor product model results, allowing more efficient computation with penalized array regression.
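
    A minimal two-dimensional sketch of the tensor-product P-spline idea on made-up data (illustrative only; the paper's setting is 3-dimensional and relies on penalized array regression for the actual computations):

        import numpy as np
        from scipy.interpolate import BSpline

        def bspline_basis(x, n_bases=8, degree=3):
            # Equally spaced B-spline basis evaluated at the points x.
            inner = np.linspace(x.min(), x.max(), n_bases - degree + 1)
            dx = inner[1] - inner[0]
            knots = np.r_[inner[0] - dx * np.arange(degree, 0, -1), inner,
                          inner[-1] + dx * np.arange(1, degree + 1)]
            return np.column_stack([BSpline(knots, np.eye(n_bases)[j], degree)(x)
                                    for j in range(n_bases)])

        # Noisy surface observed on a regular 30 x 30 grid.
        rng = np.random.default_rng(2)
        xg = np.linspace(0.0, 1.0, 30)
        yg = np.linspace(0.0, 1.0, 30)
        Xg, Yg = np.meshgrid(xg, yg, indexing="ij")
        Z = np.sin(2 * np.pi * Xg) * np.cos(2 * np.pi * Yg) + 0.3 * rng.normal(size=Xg.shape)

        Bx, By = bspline_basis(xg), bspline_basis(yg)
        B = np.kron(Bx, By)                              # tensor-product design matrix
        D = np.diff(np.eye(Bx.shape[1]), n=2, axis=0)    # second-order difference matrix
        P = np.kron(D.T @ D, np.eye(By.shape[1])) + np.kron(np.eye(Bx.shape[1]), D.T @ D)

        lam = 1.0                                        # smoothing parameter
        alpha = np.linalg.solve(B.T @ B + lam * P, B.T @ Z.ravel())
        Z_hat = (B @ alpha).reshape(Z.shape)             # spatially smooth fit

    Because the design has Kronecker structure, the full matrix B never needs to be formed explicitly in practice; exploiting this is the essence of the penalized array regression mentioned in the abstract.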

    Space-Varying Coefficient Models for Brain Imaging

    The methodological development and the application in this paper originate from diffusion tensor imaging (DTI), a powerful nuclear magnetic resonance technique enabling diagnosis and monitoring of several diseases as well as reconstruction of neural pathways. We reformulate the current analysis framework of separate voxelwise regressions as a 3d space-varying coefficient model (VCM) for the entire set of DTI images recorded on a 3d grid of voxels. By borrowing strength from spatially adjacent voxels, smoothing noisy observations, and estimating diffusion tensors at any location within the brain, this single model replaces the three-step cascade of standard data processing. We conceptualize two VCM variants based on B-spline basis functions: a full tensor product approach and a sequential approximation, rendering the VCM numerically and computationally feasible even for the huge dimension of the joint model in a realistic setup. A simulation study shows that both approaches outperform the standard method of voxelwise regressions with subsequent regularization. Owing to its greater efficiency, we apply the sequential method to a clinical DTI data set and demonstrate the model's inherent ability to increase the rigid grid resolution by evaluating the incorporated basis functions at intermediate points. In conclusion, the suggested fitting methods clearly improve on the current state of the art, although better local adaptivity remains desirable.
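
    For orientation, a sketch of the standard voxelwise regression that the space-varying coefficient model replaces: at each voxel, the log-transformed diffusion-weighted signal is linear in the six distinct entries of the diffusion tensor (the b-value, gradient directions, tensor, and noise level below are hypothetical):

        import numpy as np

        # Six diffusion-weighted directions at b = 1000 s/mm^2 plus one b = 0 image.
        bvals = np.array([0.0] + [1000.0] * 6)
        dirs = np.array([[0, 0, 0],
                         [1, 0, 0], [0, 1, 0], [0, 0, 1],
                         [1, 1, 0], [1, 0, 1], [0, 1, 1]], dtype=float)
        dirs[1:] /= np.linalg.norm(dirs[1:], axis=1, keepdims=True)

        def design_row(b, g):
            # log S = log S0 - b * g' D g, coefficients (Dxx, Dyy, Dzz, Dxy, Dxz, Dyz)
            gx, gy, gz = g
            return np.r_[1.0, -b * np.array([gx**2, gy**2, gz**2,
                                             2*gx*gy, 2*gx*gz, 2*gy*gz])]

        X = np.vstack([design_row(b, g) for b, g in zip(bvals, dirs)])

        # Simulated signals for one voxel with a known, anisotropic tensor (mm^2/s).
        D_true = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
        S = 1000.0 * np.exp(-bvals * np.einsum("ki,ij,kj->k", dirs, D_true, dirs))
        S *= np.exp(0.02 * np.random.default_rng(3).normal(size=S.shape))   # noise

        beta, *_ = np.linalg.lstsq(X, np.log(S), rcond=None)
        Dxx, Dyy, Dzz, Dxy, Dxz, Dyz = beta[1:]
        print("estimated diagonal entries:", Dxx, Dyy, Dzz)

    The VCM described above replaces these independent per-voxel estimates with spatially smooth B-spline expansions of the coefficients over the whole voxel grid.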

    Mathematical Models for Natural Gas Forecasting

    It is vital for natural gas Local Distribution Companies (LDCs) to forecast their customers' natural gas demand accurately. A significant error on a single very cold day can cost the customers of an LDC millions of dollars. This paper looks at the financial implications of natural gas forecasting, the nature of the forecasting problem, and the factors that impact natural gas consumption, and surveys the mathematical techniques and practices used to model natural gas demand. Many of the techniques discussed in this paper are implemented in the GasDay™ software, which is currently used by 24 LDCs throughout the United States, forecasting about 20% of total U.S. residential, commercial, and industrial consumption. Results on GasDay's forecasting performance are also presented.
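
    One of the simplest models in the family surveyed (a sketch with made-up numbers, not GasDay's model) regresses daily sendout on heating degree days, HDD = max(0, 65 - average daily temperature in degrees F):

        import numpy as np

        # Hypothetical one-year history of daily average temperature and sendout.
        rng = np.random.default_rng(4)
        temp = rng.uniform(0.0, 80.0, size=365)                  # deg F
        hdd = np.maximum(0.0, 65.0 - temp)                       # heating degree days
        sendout = 50.0 + 8.0 * hdd + rng.normal(scale=20.0, size=temp.size)

        # Fit sendout ~ base load + heating load by least squares.
        A = np.column_stack([np.ones_like(hdd), hdd])
        (base_load, heat_slope), *_ = np.linalg.lstsq(A, sendout, rcond=None)

        design_day_temp = 10.0                                   # a very cold day
        forecast = base_load + heat_slope * max(0.0, 65.0 - design_day_temp)
        print(f"forecast sendout at {design_day_temp:.0f} F: {forecast:.1f}")

    In practice, demand models layer further weather and calendar effects on top of this basic temperature response.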

    Twenty years of P-splines

    P-splines first appeared in the limelight twenty years ago. Since then they have become popular in applications and in theoretical work. The combination of a rich B-spline basis and a simple difference penalty lends itself well to a variety of generalizations, because it is based on regression. In effect, P-splines allow the building of a “backbone” for the “mixing and matching” of a variety of additive smooth structure components, while inviting all sorts of extensions: varying-coefficient effects, signal (functional) regressors, two-dimensional surfaces, non-normal responses, quantile (expectile) modelling, among others. Strong connections with mixed models and Bayesian analysis have been established. We give an overview of many of the central developments during the first two decades of P-splines.
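
    The core recipe in a few lines of Python (a sketch, not code from the paper): a deliberately rich cubic B-spline basis combined with a second-order difference penalty on its coefficients, fitted by penalized least squares.

        import numpy as np
        from scipy.interpolate import BSpline

        def bspline_basis(x, n_bases=20, degree=3):
            # Equally spaced B-spline basis evaluated at the points x.
            inner = np.linspace(x.min(), x.max(), n_bases - degree + 1)
            dx = inner[1] - inner[0]
            knots = np.r_[inner[0] - dx * np.arange(degree, 0, -1), inner,
                          inner[-1] + dx * np.arange(1, degree + 1)]
            return np.column_stack([BSpline(knots, np.eye(n_bases)[j], degree)(x)
                                    for j in range(n_bases)])

        rng = np.random.default_rng(5)
        x = np.linspace(0.0, 1.0, 200)
        y = np.sin(6.0 * x) + 0.3 * rng.normal(size=x.size)

        B = bspline_basis(x)                             # rich basis (20 functions)
        D = np.diff(np.eye(B.shape[1]), n=2, axis=0)     # second-order differences
        lam = 1.0                                        # smoothing parameter
        alpha = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
        y_hat = B @ alpha                                # smooth P-spline fit

    The penalty weight lam moves the fit continuously from the unpenalized rich regression (small lam) toward a straight line (large lam, for second-order differences); in practice it is chosen by cross-validation, an information criterion, or the mixed-model and Bayesian representations mentioned above.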

    Inducible Presynaptic Glutamine Transport Supports Glutamatergic Transmission at the Calyx of Held Synapse

    The mechanisms by which the excitatory neurotransmitter glutamate is recycled at synapses are currently unknown. By examining the functional expression of plasma membrane transporters at presynaptic terminals, we aim to elucidate some of the mechanisms of glutamate recycling. Using whole-cell voltage-clamp recordings from rat calyx of Held presynaptic terminals, our data show, for the first time, that the glutamate precursor glutamine causes the direct activation of an electrogenic, sodium-dependent presynaptic transporter, which supplies glutamine for generation of presynaptic glutamate and helps sustain synaptic transmission. Interestingly, the functional expression of this transporter at the presynaptic plasma membrane is dynamically controlled by electrical activity of the terminal, indicating that uptake of neurotransmitter precursors is controlled by the demand at an individual terminal. Induction of the transporter current is calcium-dependent and inhibited by botulinum neurotoxin C, demonstrating the involvement of SNARE-dependent exocytosis in inserting transporters into the plasma membrane when the terminal is active. Conversely, inactivity of the presynaptic terminal results in removal of transporters via clathrin-mediated endocytosis. To investigate whether the presynaptic glutamine transporter supplies the precursor for generating the synaptically released glutamate, we measured miniature EPSCs to assess vesicular glutamate content. When the presynaptic glutamate pool was turned over by synaptic activity, inhibiting the presynaptic glutamine transporters with MeAIB significantly reduced the miniature EPSC amplitude. This demonstrates that presynaptic glutamine transport is centrally involved in the production of glutamate and assists in maintaining excitatory neurotransmission. © 2013 the authors.

    Computer-based and paper-and-pencil tests: A study in calculus for STEM majors

    Computer-based testing is an expanding use of technology that offers advantages to teachers and students. We studied Calculus II classes for STEM majors taught under different testing modes. Three sections with a total of 324 students used, respectively, paper-and-pencil testing, computer-based testing, and a combination of both. Computer-based tests gave immediate feedback and allowed multiple submissions and question pooling; paper-and-pencil tests required written work and explanation, allowing inspection of high-cognitive-demand tasks. Each testing mode thus played to its own strengths. Students received the same lectures from the same instructor on the same days, with the same homework assignments and due dates. The design is quasi-experimental, but students were not aware of the testing mode at registration. Two basic questions were examined: (1) Do paper-and-pencil and computer-based tests measure knowledge and skill in STEM Calculus II in a consistent manner? (2) How does the knowledge and skill gained by students in a fully computer-based Calculus II class compare with that of students in a class requiring paper-and-pencil tests, and hence some paper-and-pencil work? The results indicate that computer-based tests are as consistent with paper-and-pencil tests as computer-based tests are with themselves. The results are also consistent with classes using paper-and-pencil tests having slightly better outcomes than fully computer-based classes using only computer assessments.
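
    The two study questions amount to checking score consistency across testing modes and comparing outcomes between sections; a minimal sketch of such an analysis on hypothetical scores (not the paper's data, sample sizes, or results):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)

        # Mixed section: each student has both a computer-based and a paper score.
        ability = rng.normal(75.0, 10.0, size=110)
        computer = ability + rng.normal(0.0, 6.0, size=ability.size)
        paper = ability + rng.normal(0.0, 6.0, size=ability.size)
        r, p_r = stats.pearsonr(computer, paper)
        print(f"consistency across modes: r = {r:.2f} (p = {p_r:.3g})")

        # Outcome comparison between a fully computer-based section and a
        # paper-and-pencil section (Welch's t-test, unequal variances).
        section_cbt = rng.normal(74.0, 11.0, size=105)
        section_pp = rng.normal(76.0, 11.0, size=109)
        t, p_t = stats.ttest_ind(section_pp, section_cbt, equal_var=False)
        print(f"paper minus computer sections: t = {t:.2f} (p = {p_t:.3g})")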

    Low Temperature Co-Fired Ceramics for Micro-Fluidics

    The miniaturization of analytical instruments and the packaging of novel sensors are areas that have attracted significant research interest and offer many opportunities for product commercialization. Low Temperature Co-fired Ceramics (LTCC) is a materials system composed of alumina and glass in an organic binder. LTCC is a good choice for sensor development because of the ease of incorporating features in the ‘green’ or unfired state, such as electrical traces, fluidic pathways, and passive electrical components. After a firing cycle, what remains is a robust, monolithic device with features embedded in the package. For LTCC to be a successful medium for small-scale sensors or lab-in-package devices, fluid flow through channels and inlet/outlet ports must be perfected. Device inlet/outlet ports have been demonstrated by embedding sapphire tubes in LTCC, allowing external connections using compression fittings. Channels and cavities have been fabricated through the use of sacrificial carbon tapes and pastes. A field flow fractionation device, used for separating or concentrating constituent components in a fluid, and a multi-electrode electrochemical cell were fabricated with the techniques described in this paper.