
    Signal-to-Noise Ratio in Heat-Assisted-Recording Media: A Comparison between Simulations and Experiments

    We develop a code to extract the signal-to-noise ratio (SNR) arising from the magnetic film in a recording medium. The approach allows us to separate the remanence and transition contributions from the global spatial noise. The results are in excellent agreement with the analysis performed on the same data sets by Seagate proprietary software based on ensemble waveform analysis. We then apply this analytical approach to the results of heat-assisted magnetic recording (HAMR) dynamics simulations performed with the open-source multi-time-scale micromagnetic code mars, and compare these with experimental spin-stand measurements of analogous systems. The proposed model could serve as a standard tool for understanding the underlying physics of the noise components affecting HAMR operation and for reducing the noise arising from the medium to improve HAMR writing performance.
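
    As a rough illustration of the ensemble waveform analysis mentioned above, the sketch below separates the repeatable signal from read-to-read noise by averaging repeated read-back waveforms. It is a minimal stand-in, not the authors' code: the array shapes, the synthetic square-wave pattern, and the single lumped noise term (no remanence/transition split) are all illustrative assumptions.

```python
import numpy as np

def ensemble_snr(waveforms):
    """Estimate media SNR from repeated read-backs of the same track.

    waveforms: array of shape (n_reads, n_samples).
    """
    # The ensemble average over reads estimates the repeatable signal;
    # the per-read residual is attributed to noise.
    signal = waveforms.mean(axis=0)
    noise = waveforms - signal
    p_signal = np.mean(signal**2)
    p_noise = np.mean(noise**2)
    return 10.0 * np.log10(p_signal / p_noise)

# Example with a synthetic square-wave "recorded pattern" (hypothetical).
rng = np.random.default_rng(0)
clean = np.sign(np.sin(np.linspace(0, 20 * np.pi, 2000)))
reads = clean + 0.3 * rng.standard_normal((50, 2000))
print(f"SNR ~ {ensemble_snr(reads):.1f} dB")
```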

    Development of a prototype for high-frequency mental health surveillance in Germany: data infrastructure and statistical methods

    In the course of the COVID-19 pandemic and the implementation of the associated non-pharmaceutical containment measures, the need for continuous monitoring of the mental health of populations became apparent. When the pandemic hit Germany, a nationwide Mental Health Surveillance (MHS) was in conceptual development at Germany’s governmental public health institute, the Robert Koch Institute. To meet the need for high-frequency reporting on population mental health, we developed a prototype that provides monthly estimates of several mental health indicators with smoothing splines. We used data from the telephone surveys German Health Update (GEDA) and COVID-19 vaccination rate monitoring in Germany (COVIMO). This paper describes the highly automated data pipeline that produces time series data for graphical representations, including details on data collection, data preparation, calculation of estimates, and output creation. Furthermore, the statistical methods used in the weighting algorithm, the model estimations for moving three-month predictions, and the smoothing techniques are described and discussed. Generalized additive modelling with smoothing splines best meets the desired criteria with regard to identifying general time trends. We show that the prototype is suitable for population-based high-frequency mental health surveillance that is fast, flexible, and able to identify variation in the data over time. The automated and standardized data pipeline can also easily be applied to other health topics or other surveys and survey types. It is highly suitable as a data processing tool for the efficient continuous health surveillance required in fast-moving times of crisis such as the COVID-19 pandemic.
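
    The core statistical step, generalized additive modelling with a smoothing spline over time, can be sketched in a few lines. The snippet below is a hypothetical Python stand-in (using the pygam package), not the authors' pipeline; the synthetic data, the number of splines, and the grid-search smoothing selection are illustrative assumptions.

```python
import numpy as np
from pygam import LinearGAM, s  # pip install pygam

# Hypothetical monthly indicator values (e.g., a mean symptom score per
# survey month); the real pipeline uses weighted GEDA/COVIMO survey data.
rng = np.random.default_rng(1)
months = np.arange(36).reshape(-1, 1)
indicator = 10 + np.sin(months.ravel() / 6) + 0.3 * rng.standard_normal(36)

# Generalized additive model with a penalized smoothing spline over time;
# gridsearch picks the smoothing strength by generalized cross-validation.
gam = LinearGAM(s(0, n_splines=15)).gridsearch(months, indicator)
trend = gam.predict(months)
ci = gam.confidence_intervals(months, width=0.95)
```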

    Beam scanning by liquid-crystal biasing in a modified SIW structure

    A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for application in 2D scanning with lateral alignment. The 2D array environment imposes full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing: the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW), modified to work as a Groove Gap Waveguide, with radiating slots etched in the upper broad wall so that it radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs. At the same time, the RF field remains laterally confined, making it possible to lay several antennas in parallel and achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.
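
    The scanning mechanism can be illustrated with the first-order leaky-wave relation between the guided phase constant and the beam direction. The numbers below (frequency, effective guide width, LC permittivity range) are illustrative assumptions, not the paper's design values.

```python
import numpy as np

# First-order leaky-wave relation: the main beam points near
# theta = arcsin(beta / k0), with beta the phase constant of the
# LC-filled guide. The DC bias reorients the LC, changing eps_r and
# hence beta, so the beam scans at a fixed frequency.
c = 3e8
f = 28e9                 # fixed operating frequency (assumed)
k0 = 2 * np.pi * f / c
a_eff = 3.46e-3          # effective guide width (assumed)

for eps_r in (2.5, 2.8, 3.1):  # plausible LC tuning range (assumed)
    # TE10-like dispersion in a dielectric-filled guide
    beta = np.sqrt(eps_r * k0**2 - (np.pi / a_eff)**2)
    theta = np.degrees(np.arcsin(beta / k0))
    print(f"eps_r = {eps_r:.1f} -> beam at {theta:4.1f} deg")
```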

    Ellipsoid fitting with the Cayley transform

    We introduce an algorithm, Cayley transform ellipsoid fitting (CTEF), that uses the Cayley transform to fit ellipsoids to noisy data in any dimension. Unlike many ellipsoid fitting methods, CTEF is ellipsoid specific, meaning it always returns elliptic solutions, and it can fit arbitrary ellipsoids. It also outperforms other fitting methods when data are not uniformly distributed over the surface of an ellipsoid. Inspired by calls for interpretable and reproducible methods in machine learning, we apply CTEF to dimension reduction, data visualization, and clustering. Since CTEF captures global curvature, it is able to extract nonlinear features in data that other methods fail to identify. This is illustrated in the context of dimension reduction on human cell cycle data, and in the context of clustering on classical toy examples. In the latter case, CTEF outperforms 10 popular clustering algorithms.
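
    A plausible sketch of the core idea, under stated assumptions rather than the authors' exact formulation: parameterize the ellipsoid's orientation through the Cayley transform of a skew-symmetric matrix, which always yields a rotation, so every candidate fit is a genuine ellipsoid. The function names and the residual form below are illustrative.

```python
import numpy as np

def cayley_rotation(s):
    """Map 3 free parameters to a rotation via the Cayley transform
    R = (I - S)(I + S)^{-1}, with S skew-symmetric."""
    S = np.array([[0.0, -s[2], s[1]],
                  [s[2], 0.0, -s[0]],
                  [-s[1], s[0], 0.0]])
    I = np.eye(3)
    return (I - S) @ np.linalg.inv(I + S)

def ellipsoid_residuals(points, center, axes, s):
    # Residuals of ||diag(axes)^{-1} R^T (x - c)|| = 1 per data point;
    # for positive axes this is always an ellipsoid, hence
    # "ellipsoid specific".
    R = cayley_rotation(s)
    z = (points - center) @ R / axes  # principal frame, scaled by axes
    return np.linalg.norm(z, axis=1) - 1.0
```

    A residual of this form could then be minimized with a generic nonlinear least-squares solver such as scipy.optimize.least_squares.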

    On the Lipschitz Constant of Deep Networks and Double Descent

    Existing bounds on the generalization error of deep networks assume some form of smooth or bounded dependence on the input variable, but fall short of investigating the mechanisms controlling such factors in practice. In this work, we present an extensive experimental study of the empirical Lipschitz constant of deep networks undergoing double descent, and highlight non-monotonic trends strongly correlating with the test error. Building a connection between parameter-space and input-space gradients for SGD around a critical point, we isolate two important factors, namely loss landscape curvature and distance of parameters from initialization, which respectively control optimization dynamics around a critical point and bound model function complexity, even beyond the training data. Our study presents novel insights on implicit regularization via overparameterization, and on effective model complexity for networks trained in practice.
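
    For orientation, the empirical Lipschitz constant of a trained network can be lower-bounded from data by finite differences. The sketch below is a crude proxy of this kind, not the paper's estimator; the random pairing strategy and sample count are arbitrary choices.

```python
import torch

def empirical_lipschitz(model, x, n_pairs=2000):
    """Lower-bound the Lipschitz constant of `model` on the data support
    by the largest ratio ||f(x1) - f(x2)|| / ||x1 - x2|| over random
    pairs of samples. A crude empirical proxy for illustration only."""
    model.eval()
    n = x.shape[0]
    i = torch.randint(n, (n_pairs,))
    j = torch.randint(n, (n_pairs,))
    keep = i != j                      # drop degenerate pairs
    i, j = i[keep], j[keep]
    with torch.no_grad():
        fi, fj = model(x[i]), model(x[j])
    num = (fi - fj).flatten(1).norm(dim=1)
    den = (x[i] - x[j]).flatten(1).norm(dim=1)
    return (num / den).max().item()
```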

    Nonparametric Multi-shape Modeling with Uncertainty Quantification

    The modeling and uncertainty quantification of closed curves is an important problem in the field of shape analysis, and can have significant ramifications for subsequent statistical tasks. Many of these tasks involve collections of closed curves, which often exhibit structural similarities at multiple levels. Modeling multiple closed curves in a way that efficiently incorporates such between-curve dependence remains a challenging problem. In this work, we propose and investigate a multiple-output (multi-output), multi-dimensional Gaussian process modeling framework. We illustrate the proposed methodological advances, and demonstrate the utility of meaningful uncertainty quantification, on several curve and shape-related tasks. This model-based approach not only addresses the problem of inference on closed curves (and their shapes) with kernel constructions, but also opens doors to nonparametric modeling of multi-level dependence for functional objects in general.
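
    A single-curve version of the idea is easy to sketch: model a closed curve as a vector-valued function of a periodic parameter, with a periodic kernel enforcing closure and the GP posterior supplying uncertainty. The snippet below uses scikit-learn and treats the two coordinates independently, so it omits the paper's main contribution of between-curve (multi-output) dependence; all settings are illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ExpSineSquared, WhiteKernel

# Noisy samples of a closed curve, parameterized on [0, 2*pi).
rng = np.random.default_rng(2)
t = np.linspace(0, 2 * np.pi, 40, endpoint=False).reshape(-1, 1)
curve = np.c_[2 * np.cos(t.ravel()), np.sin(t.ravel())]
y = curve + 0.05 * rng.standard_normal(curve.shape)

# A periodic kernel (period 2*pi) enforces closure: f(0) = f(2*pi).
kernel = ExpSineSquared(length_scale=1.0, periodicity=2 * np.pi) \
    + WhiteKernel(noise_level=1e-3)
gp = GaussianProcessRegressor(kernel=kernel).fit(t, y)

t_fine = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
mean, std = gp.predict(t_fine, return_std=True)  # pointwise uncertainty
```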

    Characterising cosmic birefringence in the presence of galactic foregrounds and instrumental systematic effects

    We study the possibility of constraining isotropic cosmic birefringence with cosmic microwave background polarisation data in the presence of polarisation-angle miscalibration, without relying on any assumptions about the Galactic foreground angular power spectra, in particular their EB correlation. We propose a new analysis framework based on a generalised parametric component separation approach, which simultaneously accounts for the presence of Galactic foregrounds, relevant instrumental effects, and external priors. We find that upcoming multi-frequency CMB data with appropriate calibration priors will allow producing an instrumental-effect-corrected and foreground-cleaned CMB map, which can be used to estimate the isotropic birefringence angle and the tensor-to-scalar ratio, accounting for the statistical and systematic uncertainties incurred during the entire procedure. In particular, for a Simons Observatory-like configuration of three Small Aperture Telescopes, we derive an uncertainty on the birefringence angle of σ(β_b) = 0.07° (0.1°), assuming the standard cosmology and calibration priors with precision σ(α_i) = 0.1° for all (a single) frequency channels, as aimed at by near-future ground-based experiments. This implies that these experiments could confirm or disprove the recently detected value of β_b = 0.35° with a significance between 3 and 5σ. [abridged version]
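
    For orientation, isotropic birefringence acts like a global rotation of the observed Stokes parameters, just as a miscalibrated polarisation angle does, and the induced EB cross-spectrum is what carries the signal. The sketch below states the standard small-angle relations; sign conventions differ between references, so treat it as illustrative rather than the paper's pipeline.

```python
import numpy as np

def rotate_stokes(Q, U, beta_deg):
    """Rotate linear polarization by an angle beta, i.e.
    (Q + iU) -> (Q + iU) * exp(2i*beta). Sign conventions vary."""
    b = 2 * np.radians(beta_deg)
    return Q * np.cos(b) - U * np.sin(b), Q * np.sin(b) + U * np.cos(b)

# The induced EB cross-spectrum that the analysis fits is, to first
# order, C_l^{EB} ~ 0.5 * sin(4*beta) * (C_l^{EE} - C_l^{BB}).
# A per-channel miscalibration angle alpha_i enters the same way,
# which is why calibration priors sigma(alpha_i) are needed to
# separate it from a cosmological beta.
```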

    Sputter deposition on composites: interplay between film and substrate properties


    Weighted inhomogeneous regularization for inverse problems with indirect and incomplete measurement data

    Regularization promotes well-posedness in solving an inverse problem with incomplete measurement data. The regularization term is typically designed based on an a priori characterization of the unknown signal, such as sparsity or smoothness. The standard inhomogeneous regularization incorporates a spatially changing exponent p in the standard ℓ_p norm-based regularization to recover a signal whose characteristics vary spatially. This study proposes a weighted inhomogeneous regularization that extends the standard inhomogeneous regularization through a new exponent design and spatially varying weights. The new exponent design avoids misclassification when regions of different characteristics lie close to each other. The weights handle a second issue: a region of one characteristic may be too small to be recovered effectively by the ℓ_p norm-based regularization even after it is identified correctly. A suite of numerical tests, including synthetic image experiments and the recovery of real sea ice from incomplete wave measurements, shows the efficacy of the proposed weighted inhomogeneous regularization.
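
    Written out, one plausible form of the objective described above (the exact formulation is in the paper, not reproduced here) is:

```latex
\hat{x} \;=\; \arg\min_{x}\;
\tfrac{1}{2}\,\|Ax - y\|_2^2
\;+\; \lambda \sum_{i} w_i\, |x_i|^{p_i}
```

    where A is the indirect, incomplete measurement operator, p_i the spatially varying exponent, and w_i the spatially varying weights; taking w_i ≡ 1 recovers the standard inhomogeneous regularization.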