How to avoid potential pitfalls in recurrence plot based data analysis
Recurrence plots and recurrence quantification analysis have become popular in the last two decades. Recurrence-based methods have, on the one hand, a deep foundation in the theory of dynamical systems and are, on the other hand, powerful tools for the investigation of a variety of problems. This growing interest, however, brings with it a growing risk of misuse and uncritical application of these methods. We therefore point out potential problems and pitfalls related to different aspects of the application of recurrence plots and recurrence quantification analysis.
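A recurrence plot is built from the pairwise distances of (embedded) state vectors, R_ij = Θ(ε − ||x_i − x_j||). The following minimal Python sketch shows this construction and the simplest recurrence quantification measure; the threshold ε and embedding delay are arbitrary choices for a toy signal, precisely the kind of parameter that the abstract warns must not be chosen uncritically.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix R[i, j] = 1 if ||x_i - x_j|| <= eps.

    x   : (N, d) array of (embedded) state vectors
    eps : recurrence threshold -- a common pitfall is choosing it
          without regard to the signal's amplitude or noise level.
    """
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return (d <= eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points, the simplest RQA measure."""
    return R.mean()

# toy example: a noisy sine, delay-embedded in 2 dimensions
t = np.linspace(0, 8 * np.pi, 400)
s = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
tau = 10  # embedding delay (an assumption for this toy signal)
x = np.column_stack([s[:-tau], s[tau:]])

R = recurrence_matrix(x, eps=0.2)
print(f"recurrence rate: {recurrence_rate(R):.3f}")
```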
Can Zipf's law be adapted to normalize microarrays?
BACKGROUND: Normalization is the process of removing non-biological sources of variation between array experiments. Recent investigations of data in gene expression databases for varying organisms and tissues have shown that the majority of expressed genes exhibit a power-law distribution with an exponent close to -1 (i.e. obey Zipf's law). Based on the observation that our single-channel and two-channel microarray data sets also followed a power-law distribution, we were motivated to develop a normalization method based on this law and to examine how it compares with existing published techniques. A computationally simple and intuitively appealing technique based on this observation is presented. RESULTS: In pairwise comparisons using MA plots (log ratio vs. log intensity), we compared this novel method to previously published normalization techniques, namely global normalization to the mean, the quantile method, and a variation on the loess normalization method designed specifically for boutique microarrays. The results indicated that, for single-channel microarrays, the quantile method was superior with regard to eliminating intensity-dependent effects (banana curves), but Zipf's law normalization also minimizes this effect by rotating the data distribution so that the maximal number of data points lie on the zero of the log-ratio axis. For two-channel boutique microarrays, Zipf's law normalization performed as well as, or better than, the existing techniques. CONCLUSION: Zipf's law normalization is a useful tool where the quantile method cannot be applied, as is the case with microarrays containing functionally specific gene sets (boutique arrays).
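One plausible reading of the method is that each array's intensities are rescaled so that log(intensity) against log(rank) has slope exactly -1, as Zipf's law predicts. The sketch below implements that reading; it is an illustration of the idea, not the authors' exact procedure, and all data are synthetic.

```python
import numpy as np

def zipf_normalize(intensities):
    """Illustrative Zipf's-law normalization for one array.

    Rescales log intensities so that log(intensity) vs. log(rank)
    has slope -1; an interpretation of the method described in the
    abstract, not a reproduction of the published algorithm.
    """
    log_i = np.log(intensities)
    order = np.argsort(-intensities)  # rank 1 = brightest spot
    log_rank = np.log(np.arange(1, len(intensities) + 1))
    slope, intercept = np.polyfit(log_rank, log_i[order], 1)
    # rotate the distribution so the fitted slope becomes exactly -1
    corrected = log_i[order] + (-1.0 - slope) * log_rank
    out = np.empty_like(corrected)
    out[order] = corrected
    return np.exp(out)

rng = np.random.default_rng(1)
raw = (rng.pareto(1.0, 5000) + 1) * 100  # power-law-like toy intensities
normed = zipf_normalize(raw)
```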
Understanding fragility in supercooled Lennard-Jones mixtures. II. Potential energy surface
We numerically investigated the connection between isobaric fragility and the properties of high-order stationary points of the potential energy surface in different supercooled Lennard-Jones mixtures. The increase of effective activation energies upon supercooling appears to be driven by the increase of average potential energy barriers, as measured by the energy dependence of the fraction of unstable modes. The sharper this increase, the more fragile the mixture. Correlations between fragility and other properties of high-order stationary points, including the vibrational density of states and the localization features of unstable modes, are also discussed.
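The central diagnostic here, the fraction of unstable modes at a stationary point, is the share of negative eigenvalues of the Hessian of the potential energy. A minimal sketch of that computation follows; the toy Hessian is assumed for illustration.

```python
import numpy as np

def unstable_mode_fraction(hessian, tol=1e-8):
    """Fraction of unstable (negative-curvature) modes at a stationary
    point of the potential energy surface: the share of Hessian
    eigenvalues below -tol.  A minimal sketch; in a real calculation,
    zero modes from global translations/rotations must be projected out.
    """
    eigvals = np.linalg.eigvalsh(hessian)
    return np.count_nonzero(eigvals < -tol) / eigvals.size

# toy 3x3 Hessian with one negative-curvature direction (a saddle)
H = np.diag([-0.5, 1.2, 2.0])
print(unstable_mode_fraction(H))  # -> 0.333...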
Analysis of Flexural Strength and Contact Pressure After Simulated Chairside Adjustment of Pressed Lithium Disilicate Glass-Ceramic
Statement of problem. Research evaluating the load-to-failure of pressed lithium disilicate glass-ceramic (LDGC) with a clinically validated test after adjustment and repair procedures is scarce. Purpose. The purpose of this in vitro study was to investigate the effect of simulated chairside adjustment of the intaglio surface of monolithic pressed LDGC and of procedures intended to repair the resulting damage. Material and methods. A total of 423 IPS e.max Press (Ivoclar Vivadent AG) disks (15 mm diameter, 1 mm height) were used in the study. The material was tested by using an equibiaxial loading arrangement (n≥30/group) and a contact pressure test (n≥20/group). Specimens were assigned to 1 of 14 groups: one-half of the specimens underwent the equibiaxial load test and the other half underwent contact pressure testing. Testing was performed in 2 parts, before glazing and after glazing. Before-glazing specimens were devested and entered the test protocol, while after-glazing specimens were devested and glazed before entering the test protocol. Equibiaxial flexure test specimens were placed on a ring-on-ring apparatus and loaded until failure. Contact pressure specimens were cemented to epoxy resin blocks with a resin cement and loaded with a 50-mm diameter hemisphere until failure. Tests were performed on a universal testing machine at a crosshead speed of 0.5 mm/min. Weibull statistics and likelihood ratio contour plots were used to determine intergroup differences (95% confidence bounds). Results. Before glazing, the equibiaxial flexural strength test and the Weibull and likelihood ratio contour plots demonstrated a significantly higher failure strength for group 1EC (188 MPa) than for the damaged and/or repaired groups. Glazing following diamond adjustment (1EGG) was the most beneficial post-damage procedure (176 MPa). In the contact pressure test, the Weibull and likelihood ratio contour plots revealed no significant difference between the 1PC (98 MPa) and 1PGG (98 MPa) groups. Diamond adjustment without glazing (1EG and 1PG) resulted in the next-to-lowest equibiaxial flexure strength and the lowest contact pressure. After glazing, the strength of all groups subjected to glazing following devesting increased in comparison with the corresponding groups in the before-glazing part of the study. Conclusions. A glazing treatment improved the mechanical properties of diamond-adjusted IPS e.max Press disks when evaluated by equibiaxial flexure and contact pressure tests. Clinical implications. When adjustments are made on the intaglio surface of a pressed lithium disilicate glass-ceramic, a subsequent glazing treatment is recommended to improve strength.
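The Weibull analysis used for intergroup comparison fits failure strengths to P_f = 1 − exp(−(σ/σ₀)^m), where m is the Weibull modulus and σ₀ the characteristic strength. A minimal sketch of such a fit follows; the strength values are hypothetical, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical failure strengths (MPa) for one test group; the real
# study used n >= 30 disks per group for the equibiaxial test.
strengths = np.array([162, 171, 180, 185, 188, 190, 194, 199,
                      205, 212, 176, 183, 191, 168, 203], float)

# Two-parameter Weibull fit (location fixed at 0, as is standard for
# ceramic strength data): shape m is the Weibull modulus, scale is
# the characteristic strength sigma_0.
m, loc, sigma0 = stats.weibull_min.fit(strengths, floc=0)
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")

# failure probability P_f = 1 - exp(-(sigma/sigma_0)^m) at a given stress
sigma = 188.0
print(f"P_f at {sigma} MPa: {stats.weibull_min.cdf(sigma, m, scale=sigma0):.2f}")
```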
Automation potential of a new, rapid, microscopy based method for screening drug-polymer solubility
For the pharmaceutical industry, preformulation screening of the compatibility of a drug with polymeric excipients can be time-consuming because of the use of trial-and-error approaches. This is also the case when selecting highly effective polymeric excipients for forming molecular dispersions in order to improve the dissolution, and hence bioavailability, of a poorly soluble drug. Previously, we developed a thermal imaging-based rapid screening method, thermal analysis by structure characterization (TASC), which can rapidly detect the melting point depression of a crystalline drug in the presence of a polymeric material. In this study, we used melting point depression as an indicator of drug solubility in a polymer and further explored the potential of the TASC method to rapidly screen and identify polymers in which a drug is likely to have high solubility. We used a data bank of 5 model drugs and 10 different pharmaceutical-grade polymers to validate the screening potential of TASC. The data indicated that TASC could significantly improve screening speed and reduce the amount of material used without compromising detection sensitivity. It should be highlighted that the current method is a screening method rather than one that provides an absolute measurement of the degree of solubility of a drug in a polymer. The results of this study confirmed that the TASC results for each drug-polymer pair could be used in data matrices to indicate the presence of significant interaction and solubility of the drug in the polymer, which forms the foundation for automating the screening process using artificial intelligence.
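The data-matrix screening step can be pictured as thresholding a drug-by-polymer table of melting-point depressions. The sketch below illustrates this with entirely hypothetical drug names, polymer names, depression values, and cut-off; it is a reading of the screening idea, not the study's data or decision rule.

```python
import numpy as np

# Hypothetical TASC-style screen: melting-point depression (K) of each
# drug in the presence of each polymer; larger depression is taken as
# a proxy for higher drug-polymer solubility.
drugs = ["drug_A", "drug_B", "drug_C"]
polymers = ["PVP", "HPMC", "Soluplus", "PEG"]
depression = np.array([[12.0,  3.5, 18.2,  1.1],
                       [ 2.4,  9.8,  7.5,  0.6],
                       [15.1, 14.3, 20.0,  4.2]])

threshold = 8.0  # assumed cut-off for a "significant" interaction
for i, drug in enumerate(drugs):
    hits = [polymers[j] for j in range(len(polymers))
            if depression[i, j] >= threshold]
    print(f"{drug}: candidate polymers -> {hits or 'none'}")
```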
Identification of Gastroenteric Viruses by Electron Microscopy Using Higher Order Spectral Features
Background: Many paediatric illnesses, for example acute gastroenteritis, are caused by viral agents. Electron microscopy can provide images of viral particles and can be used to identify the causative agents. Objectives: The use of electron microscopy as a diagnostic tool is limited by the high level of expertise needed to interpret these images and by the time required. A semi-automated method is proposed in this paper. Study design: The method is based on bispectral features that capture contour and texture information while providing robustness to shift, rotation, changes in size, and noise. The magnification or true size of the viral particles need not be known precisely, but if available it can be used to further improve classification. Viral particles from one or more images are segmented and analyzed to verify whether or not they belong to a particular class (such as Adenovirus or Rotavirus). Two experiments were conducted, differing in the populations from which virus particle images were collected for training and testing. In the first, disjoint subsets from a pooled population of virus particles obtained from several images were used. In the second, separate populations from separate images were used. The performance of the method on viruses of similar size was evaluated separately using Astrovirus, HAV, and Poliovirus. A Gaussian mixture model was used for the probability density of the features, and a threshold on the log-likelihood was varied to study the false alarm and false rejection trade-off. Features from many particles and/or likelihoods from independent tests were averaged to yield better performance. Results: An equal error rate (EER) of 2% was obtained for verification of Rotavirus (tested against three other viruses) when features from 15 viral particle images were averaged. It dropped to less than 0.2% when scores from two tests were averaged to make a decision. For verification of Astrovirus (tested against two others of the same size), the EER was less than 2% when 20 particles and two tests were used. Conclusion: Bispectral features and Gaussian mixture modelling of their probability density are shown to be effective in identifying viruses from electron microscope images. With the use of digital imaging in electron microscopes, this method can be fully automated.
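The verification step, fitting a Gaussian mixture to class-conditional features, averaging log-likelihoods over many particles, and thresholding, can be sketched as below. The random feature vectors stand in for the paper's bispectral features purely to make the example runnable, and the threshold value is an assumption.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Stand-in feature vectors: in the paper these are bispectral features
# extracted from segmented viral particles; random clusters are used
# here purely so the sketch runs end to end.
rotavirus_train = rng.normal(0.0, 1.0, size=(200, 8))
test_particles = rng.normal(0.1, 1.0, size=(15, 8))  # 15 particles, one image

# Model the class-conditional feature density with a Gaussian mixture.
gmm = GaussianMixture(n_components=4, covariance_type="diag",
                      random_state=0).fit(rotavirus_train)

# Average the log-likelihoods of many particles, then threshold:
# varying tau trades false alarms against false rejections (EER point).
score = gmm.score_samples(test_particles).mean()
tau = -12.0  # assumed operating threshold, tuned on validation data
print("accept as Rotavirus" if score > tau else "reject")
```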
Sound transmission testing of polymer compounds
Acoustic properties of polymer compounds are an important consideration for many applications. Standard test methods exist for the determination of these properties; there is, however, no standard for the equipment used in these tests, only a specification for the test conditions. The objective of this work was to evaluate the operation and performance of a bench-top laboratory sound testing system for its potential as a simple, cost-effective method for the initial evaluation of materials that require specific acoustic properties. The work was limited to an investigation of sound transmission loss (STL). A study of the effect of the sample mounting conditions on the STL was carried out. Following this, a series of polymer and polymer composite samples was tested. The results presented demonstrate the potential of the testing system as an effective standard test method for the acoustic properties of polymer composites and other materials.
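In its simplest form, sound transmission loss is the drop in sound pressure level across the sample; standard room methods add a correction for sample area and receiving-room absorption, STL = L1 − L2 + 10·log10(S/A). The sketch below uses the simplified level-difference form with illustrative, assumed measurement values.

```python
import numpy as np

def transmission_loss(level_source_db, level_receive_db):
    """Sound transmission loss as the level drop across the sample,
    STL = L1 - L2.  A simplified form; standard two-room methods also
    correct for sample area S and receiving-room absorption A:
    STL = L1 - L2 + 10*log10(S/A).
    """
    return level_source_db - level_receive_db

# illustrative third-octave-band measurement of a polymer sheet
freqs = np.array([500, 1000, 2000, 4000])  # Hz
L1 = np.array([94.0, 95.5, 96.0, 95.0])    # dB, source side
L2 = np.array([68.0, 64.5, 60.0, 55.5])    # dB, receiving side
for f, stl in zip(freqs, transmission_loss(L1, L2)):
    print(f"{f:>5} Hz: STL = {stl:.1f} dB")
```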