
    Confidence Statements for Efficiency Estimates from Stochastic Frontier Models

    This paper is an empirical study of the uncertainty associated with technical efficiency estimates from stochastic frontier models. We show how to construct confidence intervals for estimates of technical efficiency levels under different sets of assumptions ranging from the very strong to the relatively weak. We demonstrate empirically how the degree of uncertainty associated with these estimates relates to the strength of the assumptions made and to various features of the data.

    Keywords: confidence intervals, stochastic frontier models, efficiency measurement

    HoPP: Robust and Resilient Publish-Subscribe for an Information-Centric Internet of Things

    This paper revisits NDN deployment in the IoT with a special focus on the interaction of sensors and actuators. Such scenarios require high responsiveness and limited control state at the constrained nodes. We argue that the NDN request-response pattern, which prevents data push, is vital for IoT networks. We contribute HoP-and-Pull (HoPP), a robust publish-subscribe scheme for typical IoT scenarios that targets IoT networks consisting of hundreds of resource-constrained devices with intermittent connectivity. Our approach limits the FIB tables to a minimum and naturally supports mobility, temporary network partitioning, data aggregation and near real-time reactivity. We experimentally evaluate the protocol in a real-world deployment using the IoT-Lab testbed with varying numbers of constrained devices, each wirelessly interconnected via IEEE 802.15.4 LoWPANs. Implementations are built on CCN-lite with RIOT and support experiments using various single- and multi-hop scenarios.

    Sampling Errors and Confidence Intervals for Order Statistics: Implementing the Family Support Act

    The Family Support Act allows states to reimburse child care costs up to the 75th percentile of local market price for child care. States must carry out surveys to estimate these 75th percentiles. This estimation problem raises two major statistical issues: (1) picking a sample design that will allow one to estimate the percentiles cheaply, efficiently and equitably; and (2) assessing the sampling variability of the estimates obtained. For Massachusetts, we developed a sampling design that equalized the standard errors of the estimated percentiles across 65 distinct local markets. This design was chosen because state administrators felt public day care providers and child advocates would find it equitable, thus limiting costly appeals. Estimation of standard errors for the sample 75th percentiles requires estimation of the density of the population at the 75th percentile. We implement and compare a number of parametric and nonparametric methods of density estimation. A kernel estimator provides the most reasonable estimates. On the basis of the mean integrated squared error criterion we selected the Epanechnikov kernel and the Sheather-Jones automatic bandwidth selection procedure. Because some of our sample sizes were too small to rely on asymptotics, we also constructed nonparametric confidence intervals using the hypergeometric distribution. For most of our samples, these confidence intervals were similar to those based on the asymptotic standard errors. Substantively we find wide variation in the price of child care, depending on the child's age, type of care and geographic location. For full-time care, the 75th percentiles ranged from $242 per week for infants in child care centers in Boston to $85 per week for family day care in western Massachusetts.
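The distribution-free interval construction mentioned above can be sketched in stdlib Python. This is an illustrative implementation of the standard order-statistic method (the count of observations at or below the true p-th quantile is Binomial(n, p), so order statistics bracket the quantile with computable coverage), not the authors' hypergeometric finite-population version; the function names and defaults are my own.

```python
import math


def binom_cdf(k: int, n: int, p: float) -> float:
    """P(B <= k) for B ~ Binomial(n, p), computed exactly."""
    return sum(math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))


def quantile_ci(sample, p=0.75, conf=0.95):
    """Distribution-free CI for the p-th quantile from order statistics.

    Returns (lower, upper, exact_coverage). The interval is
    [X_(l), X_(u)] with l the largest index such that F(l-1) <= alpha/2
    and u the smallest index such that F(u-1) >= 1 - alpha/2, where F is
    the Binomial(n, p) CDF; its coverage is F(u-1) - F(l-1).
    """
    xs = sorted(sample)
    n = len(xs)
    alpha = 1.0 - conf

    # Largest l (1-based) with P(B <= l-1) <= alpha/2.
    l = 1
    while l < n and binom_cdf(l, n, p) <= alpha / 2:
        l += 1

    # Smallest u (1-based) with P(B <= u-1) >= 1 - alpha/2.
    u = n
    while u > 1 and binom_cdf(u - 2, n, p) >= 1 - alpha / 2:
        u -= 1

    coverage = binom_cdf(u - 1, n, p) - binom_cdf(l - 1, n, p)
    return xs[l - 1], xs[u - 1], coverage
```

For small samples this yields the same kind of conservative, asymptotics-free interval the abstract describes, at the cost of coverage that can exceed the nominal level.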

    Panel Data Models with Multiple Time-Varying Individual Effects

    This paper considers a panel data model with time-varying individual effects. The data are assumed to contain a large number of cross-sectional units repeatedly observed over a fixed number of time periods. The model shares a feature of the fixed-effects model in that the effects are assumed to be correlated with the regressors. The unobservable individual effects are assumed to have a factor structure. For consistent estimation of the model, it is important to estimate the true number of factors. We propose a generalized method of moments procedure by which both the number of factors and the regression coefficients can be consistently estimated. Some important identification issues are also discussed. Our simulation results indicate that the proposed methods produce reliable estimates.

    Keywords: panel data, time-varying individual effects, factor models

    Primary Care Validation of a Single-Question Alcohol Screening Test

    BACKGROUND Unhealthy alcohol use is prevalent but under-diagnosed in primary care settings. OBJECTIVE To validate, in primary care, a single-item screening test for unhealthy alcohol use recommended by the National Institute on Alcohol Abuse and Alcoholism (NIAAA). DESIGN Cross-sectional study. PARTICIPANTS Adult English-speaking patients recruited from primary care waiting rooms. MEASUREMENTS Participants were asked the single screening question, "How many times in the past year have you had X or more drinks in a day?", where X is 5 for men and 4 for women, and a response of ≥1 is considered positive. Unhealthy alcohol use was defined as the presence of an alcohol use disorder, as determined by a standardized diagnostic interview, or risky consumption, as determined using a validated 30-day calendar method. MAIN RESULTS Of 394 eligible primary care patients, 286 (73%) completed the interview. The single-question screen was 81.8% sensitive (95% confidence interval (CI) 72.5% to 88.5%) and 79.3% specific (95% CI 73.1% to 84.4%) for the detection of unhealthy alcohol use. It was slightly more sensitive (87.9%, 95% CI 72.7% to 95.2%) but was less specific (66.8%, 95% CI 60.8% to 72.3%) for the detection of a current alcohol use disorder. Test characteristics were similar to that of a commonly used three-item screen, and were affected very little by subject demographic characteristics. CONCLUSIONS The single screening question recommended by the NIAAA accurately identified unhealthy alcohol use in this sample of primary care patients. These findings support the use of this brief screen in primary care.

    Funding: National Institute on Alcohol Abuse and Alcoholism (R01-AA010870)
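As a rough illustration of how such screening-test characteristics are computed, here is a stdlib-Python sketch. The 2x2 counts below are made up to be roughly consistent with the reported rates, not taken from the study, and the Wilson score interval is one common CI choice (the abstract does not state which method the authors used).

```python
import math


def wilson_ci(successes: int, n: int, z: float = 1.96):
    """Wilson score interval for a binomial proportion (95% by default)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half


# Hypothetical 2x2 screening table (illustrative counts, not the study's):
tp, fn = 81, 18    # screen-positive / screen-negative among truly unhealthy
tn, fp = 148, 39   # screen-negative / screen-positive among healthy

sensitivity = tp / (tp + fn)   # P(screen+ | unhealthy)
specificity = tn / (tn + fp)   # P(screen- | healthy)
```

With these invented counts the point estimates come out near the reported 81.8% sensitivity and 79.3% specificity, and `wilson_ci(tp, tp + fn)` gives an interval in the same ballpark as the published one.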

    Otto Stern (1888-1969): The founding father of experimental atomic physics

    We review the work and life of Otto Stern who developed the molecular beam technique and with its aid laid the foundations of experimental atomic physics. Among the key results of his research are: the experimental determination of the Maxwell-Boltzmann distribution of molecular velocities (1920), experimental demonstration of space quantization of angular momentum (1922), diffraction of matter waves comprised of atoms and molecules by crystals (1931) and the determination of the magnetic dipole moments of the proton and deuteron (1933).

    Comment: 39 pages, 8 figures

    Multiple Comparisons with the Best, with Economic Applications

    In this paper we discuss a statistical method called multiple comparisons with the best, or MCB. Suppose that we have N populations, and population i has parameter value $\theta_i$. Let $\theta_{(N)}=\max_{i=1,\ldots,N}\theta_i$, the parameter value for the 'best' population. Then MCB constructs joint confidence intervals for the differences $[\theta_{(N)}-\theta_1,\theta_{(N)}-\theta_2,\ldots,\theta_{(N)}-\theta_N]$. It is not assumed that it is known which population is best, and part of the problem is to say whether any population is so identified, at the given confidence level. This paper is meant to introduce MCB to economists. We discuss possible uses of MCB in economics. The application that we treat in most detail is the construction of confidence intervals for inefficiency measures from stochastic frontier models with panel data. We also consider an application to the analysis of labour market wage gaps.
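A heavily simplified, simulation-calibrated sketch of MCB-style joint intervals follows. This is not Hsu's exact procedure: the single critical value, the assumption of equal known standard errors, and all names are my own illustrative choices.

```python
import random


def mcb_intervals(means, se, n_sim=5000, conf=0.95, seed=0):
    """Rough MCB-style joint intervals for theta_(N) - theta_i.

    Calibrates one critical value d by simulation so that, under
    independent N(0, se^2) sampling errors, the intervals
    [max(0, D_i - d*se), max(0, D_i + d*se)] hold jointly with
    probability ~conf, where D_i = max_{j != i} mean_j - mean_i.
    The differences from the best are nonnegative by definition,
    so the intervals are clamped at zero.
    """
    rng = random.Random(seed)
    N = len(means)

    # Critical value: conf-quantile of max_i (Z_max - Z_i), Z_i iid N(0,1).
    draws = []
    for _ in range(n_sim):
        z = [rng.gauss(0.0, 1.0) for _ in range(N)]
        zmax = max(z)
        draws.append(max(zmax - zi for zi in z))
    draws.sort()
    d = draws[int(conf * n_sim)]

    out = []
    for i, m in enumerate(means):
        D = max(mj for j, mj in enumerate(means) if j != i) - m
        out.append((max(0.0, D - d * se), max(0.0, D + d * se)))
    return out
```

An interval whose lower bound is zero flags a population that cannot be ruled out as the best at the given confidence level, which is exactly the identification question the abstract raises.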

    Dynamics of confined water reconstructed from inelastic x-ray scattering measurements of bulk response functions

    Nanoconfined water and surface-structured water impact a broad range of fields. For water confined between hydrophilic surfaces, measurements and simulations have shown conflicting results ranging from "liquidlike" to "solidlike" behavior, from bulklike water viscosity to viscosity orders of magnitude higher. Here, we investigate how a homogeneous fluid behaves under nanoconfinement using its bulk response function: The Green's function of water extracted from a library of S(q,ω) inelastic x-ray scattering data is used to make femtosecond movies of nanoconfined water. Between two confining surfaces, the structure undergoes drastic changes as a function of surface separation. For surface separations of ≈9 Å, although the surface-associated hydration layers are highly deformed, they are separated by a layer of bulklike water. For separations of ≈6 Å, the two surface-associated hydration layers are forced to reconstruct into a single layer that modulates between localized "frozen" and delocalized "melted" structures due to interference of density fields. These results potentially reconcile recent conflicting experiments. Importantly, we find a different delocalized wetting regime for nanoconfined water between surfaces with high spatial frequency charge densities, where water is organized into delocalized hydration layers instead of localized hydration shells and is strongly resistant to "freezing" down to molecular distances (<6 Å).

    Flavour symmetry breaking in the kaon parton distribution amplitude

    We compute the kaon's valence-quark (twist-two parton) distribution amplitude (PDA) by projecting its Poincaré-covariant Bethe-Salpeter wave-function onto the light-front. At a scale $\zeta=2\,{\rm GeV}$, the PDA is a broad, concave and asymmetric function, whose peak is shifted 12-16% away from its position in QCD's conformal limit. These features are a clear expression of SU(3)-flavour-symmetry breaking. They show that the heavier quark in the kaon carries more of the bound-state's momentum than the lighter quark and also that emergent phenomena in QCD modulate the magnitude of flavour-symmetry breaking: it is markedly smaller than one might expect based on the difference between light-quark current masses. Our results add to a body of evidence which indicates that at any energy scale accessible with existing or foreseeable facilities, a reliable guide to the interpretation of experiment requires the use of such nonperturbatively broadened PDAs in leading-order, leading-twist formulae for hard exclusive processes instead of the asymptotic PDA associated with QCD's conformal limit. We illustrate this via the ratio of kaon and pion electromagnetic form factors: using our nonperturbative PDAs in the appropriate formulae, $F_K/F_\pi=1.23$ at spacelike $Q^2=17\,{\rm GeV}^2$, which compares satisfactorily with the value of $0.92(5)$ inferred in $e^+e^-$ annihilation at $s=17\,{\rm GeV}^2$.

    Comment: 7 pages, 2 figures, 3 tables