
    The use of neural networks for the prediction of the critical factor of safety of an artificial slope subjected to earthquake forces

    This study deals with the development of Artificial Neural Network (ANN) and Multiple Regression (MR) models for estimating the critical factor of safety (Fs) of a typical artificial slope subjected to earthquake forces. While the geometry of the slope and the properties of the man-made soil were kept constant, the natural subsoil properties, namely cohesion, internal friction angle, the bulk unit weight of the layer beneath the ground surface, and the seismic coefficient, were varied during the slope stability analyses. The Fs values of the slope were then calculated using the simplified Bishop method, and the minimum (critical) Fs value for each case was determined and used to develop the ANN and MR models. The results obtained from the models were compared with those obtained from the calculations. Moreover, several performance indices, such as the determination coefficient, variance account for, mean absolute error, and root mean square error, were calculated to check the prediction capacity of the developed models. The obtained indices make it clear that the ANN model shows higher prediction performance than the MR model.
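The simplified Bishop method mentioned above solves for Fs iteratively, because Fs appears on both sides of the equilibrium equation. A minimal fixed-point sketch of the textbook formulation follows; this is not the authors' slope model, and the `bishop_fs` name, the slice data, and the dry-slope assumption (no pore-pressure term) are ours.

```python
import math

def bishop_fs(slices, c, phi_deg, tol=1e-6, max_iter=100):
    """Simplified Bishop factor of safety by fixed-point iteration.

    slices : list of (W, alpha, b) tuples per slice, with W the slice
             weight, alpha the base inclination in radians, b the width.
    c, phi_deg : soil cohesion and friction angle (degrees).
    Pore pressure is omitted for brevity (dry slope assumption).
    """
    tan_phi = math.tan(math.radians(phi_deg))
    fs = 1.0  # initial guess
    for _ in range(max_iter):
        # m_alpha = cos(a) + sin(a) * tan(phi) / Fs for each slice
        num = sum((c * b + W * tan_phi) /
                  (math.cos(a) + math.sin(a) * tan_phi / fs)
                  for W, a, b in slices)
        den = sum(W * math.sin(a) for W, a, b in slices)
        new_fs = num / den
        if abs(new_fs - fs) < tol:
            return new_fs
        fs = new_fs
    return fs
```

For a single-slice toy circle the iteration converges in a handful of steps; in the study, one such critical Fs per parameter combination becomes a training target for the ANN and MR models.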

    With four Standard Model families, the LHC could discover the Higgs boson with a few fb^-1

    The existence of a fourth SM family would produce a large enhancement of the gluon-fusion channel of Higgs boson production at hadron colliders. In this case, the SM Higgs boson could be seen at the CERN Large Hadron Collider (LHC) via the golden mode (H -> 4l) with an integrated luminosity of only a few fb^-1.

    Java Computer Animation for Effective Learning of the Cholesky Algorithm with Transportation Engineering Applications

    In this paper, the well-known Cholesky algorithm (for solving systems of simultaneous linear equations, or SLE) is revisited, with the ultimate goal of developing a simple, user-friendly, attractive, and useful Java visualization and animation Graphical User Interface (GUI) as an additional teaching tool for students to learn the Cholesky factorization in a step-by-step fashion with computer voice and animation. A demo video of the Cholesky decomposition (or factorization) animation and results can be viewed online at http://www.lions.odu.edu/~imako001/cholesky/demo/index.html. The software tool developed in this work can be used by both students and their instructors, not only to master this technical subject but also as a dynamic, valuable tool for obtaining solutions to homework assignments, class examinations, self-assessment studies, and other coursework-related activities. Various transportation engineering applications of SLE are cited. Engineering educators who have adopted “flipped classroom instruction” can also use this Java visualization and animation software to let students self-learn these algorithms in their own time (and at their preferred locations), reserving valuable class-meeting time for discussion of more challenging, real-life problems. Statistical data comparing students’ performance with and without the developed Java computer animation are also included.
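The step-by-step factorization the animation walks through can be sketched compactly. The following is a generic textbook Cholesky routine (in Python for brevity, rather than the paper's Java), computing a lower-triangular L with A = L·Lᵀ for a symmetric positive-definite A:

```python
import math

def cholesky(A):
    """Return lower-triangular L with A = L * L^T.

    A must be symmetric positive definite, given as a list of rows.
    Entries are computed column by column, mirroring the step-by-step
    order a learner would follow by hand.
    """
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # diagonal entry
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]  # below-diagonal entry
    return L
```

Once L is known, an SLE A x = b is solved by forward substitution (L y = b) followed by back substitution (Lᵀ x = y), which is the payoff the transportation engineering applications rely on.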

    EEG Classification based on Image Configuration in Social Anxiety Disorder

    The problem of detecting the presence of Social Anxiety Disorder (SAD) from Electroencephalography (EEG) signals has seen limited study; it is addressed here with a new approach that seeks to exploit knowledge of the spatial configuration of the EEG sensors. Two classification models, one that ignores the configuration (model 1) and one that exploits it with different interpolation methods (model 2), are studied. The performance of these two models is examined on 34 EEG data channels, each consisting of five frequency bands and further decomposed with a filter bank. The data were collected from 64 subjects consisting of healthy controls and patients with SAD. The hypothesis that model 2 significantly outperforms model 1 is borne out in the results, with accuracy 6–7% higher for model 2 for each machine learning algorithm investigated. Convolutional Neural Networks (CNN) were found to provide much better performance than SVM and kNN classifiers.

    Detection of fungal damaged popcorn using image property covariance features

    Covariance-matrix-based features were applied to the detection of popcorn infected by a fungus that causes a symptom called "blue-eye". This infection of popcorn kernels causes economic losses due to the kernels' poor appearance and the frequently disagreeable flavor of the popped kernels. Images of kernels were obtained to distinguish damaged from undamaged kernels using image-processing techniques. Features for distinguishing blue-eye-damaged from undamaged popcorn kernel images were extracted from covariance matrices computed from various image pixel properties. The covariance matrices were formed from property vectors consisting of the image coordinate values, the intensity values, and the first and second derivatives in the vertical and horizontal directions of different color channels. Support Vector Machines (SVM) were used for classification. An overall recognition rate of 96.5% was achieved using these covariance-based features. A relatively low false-positive rate of 2.4% was obtained, which is important for reducing the economic loss caused by healthy kernels being discarded as fungal-damaged. The image-processing method is not computationally expensive, so it could be implemented in real-time sorting systems to separate damaged popcorn or other grains that have textural differences.
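The covariance-feature idea above can be illustrated with a minimal sketch: every pixel contributes a property vector, and the region is summarized by the covariance matrix of those vectors. This toy version uses a grayscale image and only first derivatives (the paper also uses second derivatives and multiple color channels); the `covariance_features` name and the exact property list are illustrative.

```python
def covariance_features(image):
    """Flattened upper triangle of the property covariance matrix.

    image: 2-D list of grayscale intensities.
    Property vector per pixel: (x, y, intensity, |dI/dx|, |dI/dy|),
    with derivatives approximated by neighbor differences.
    """
    h, w = len(image), len(image[0])
    vecs = []
    for y in range(h):
        for x in range(w):
            dx = image[y][min(x + 1, w - 1)] - image[y][max(x - 1, 0)]
            dy = image[min(y + 1, h - 1)][x] - image[max(y - 1, 0)][x]
            vecs.append([x, y, image[y][x], abs(dx), abs(dy)])
    n, d = len(vecs), 5
    mean = [sum(v[k] for v in vecs) / n for k in range(d)]
    cov = [[sum((v[i] - mean[i]) * (v[j] - mean[j]) for v in vecs) / (n - 1)
            for j in range(d)] for i in range(d)]
    # The covariance matrix is symmetric, so the upper triangle suffices.
    return [cov[i][j] for i in range(d) for j in range(i, d)]
```

The flattened triangle (d(d+1)/2 numbers for d properties) is the kind of fixed-length vector one would feed to an SVM, as in the study.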

    Identification of Novel Reference Genes Based on MeSH Categories

    Transcriptome experiments are performed to assess protein abundance through mRNA expression analysis. Expression levels of genes vary depending on the experimental conditions and the cell response. Transcriptome data must be diverse and yet comparable with reference to stably expressed genes, even when generated from different experiments on the same biological context in various laboratories. In this study, the expression patterns of 9090 microarray samples grouped into 381 NCBI-GEO datasets were investigated to identify novel candidate reference genes using randomizations and Receiver Operating Characteristic (ROC) curves. The analysis demonstrated that cell-type-specific reference gene sets display less variability than a united set for all tissues. Therefore, constitutively and stably expressed, origin-specific novel reference gene sets were identified based on their coefficient of variation and their percentage of occurrence across all GEO datasets, which were classified using Medical Subject Headings (MeSH). A large number of MeSH-grouped reference gene lists are presented as novel tissue-specific reference gene lists. The 17 genes most commonly observed in these sets were compared for their expression in 8 hepatocellular, 5 breast, and 3 colon carcinoma cell lines by RT-qPCR to verify tissue specificity. Indeed, the commonly used housekeeping genes GAPDH, Actin, and EEF2 showed tissue-specific variation, whereas several ribosomal genes were among the most stably expressed genes in vitro. Our results confirm that two or more reference genes should be used in combination for differential expression analysis of large-scale data obtained from microarray or next-generation sequencing studies. Therefore, context-dependent reference gene sets, as presented in this study, are required for normalization of expression data from diverse technological backgrounds.
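The coefficient-of-variation criterion used above is simple to state in code. The following is an illustrative sketch, not the authors' pipeline (which also uses occurrence percentages, randomizations, and ROC curves): rank genes by CV = std/mean across samples and keep the most stable ones as reference candidates. The `stable_genes` name and example values are ours.

```python
import math

def stable_genes(expr, top_k=2):
    """Return the top_k genes with the lowest coefficient of variation.

    expr: dict mapping gene name -> list of expression values per sample.
    A low CV (sample standard deviation / mean) marks a gene whose
    expression is stable across conditions.
    """
    def cv(values):
        m = sum(values) / len(values)
        var = sum((v - m) ** 2 for v in values) / (len(values) - 1)
        return math.sqrt(var) / m
    return sorted(expr, key=lambda g: cv(expr[g]))[:top_k]
```

Applied per MeSH category rather than globally, this is the sense in which the study's reference sets are context dependent: the genes that minimize CV within one tissue class need not minimize it in another.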

    Attributed relational graphs for cell nucleus segmentation in fluorescence microscopy images

    More rapid and accurate high-throughput screening in molecular cell biology research has become possible with the development of automated microscopy imaging, for which cell nucleus segmentation commonly constitutes the core step. Although several promising methods exist for segmenting the nuclei of monolayer isolated and less-confluent cells, segmenting the nuclei of more-confluent cells, which tend to grow in overlayers, remains an open problem. To address this problem, we propose a new model-based nucleus segmentation algorithm. This algorithm models how a human locates a nucleus by identifying the nucleus boundaries and piecing them together. In this algorithm, we define four types of primitives to represent nucleus boundaries at different orientations and construct an attributed relational graph on the primitives to represent their spatial relations. We then reduce the nucleus identification problem to finding predefined structural patterns in the constructed graph and also use the primitives in region growing to delineate the nucleus borders. Working with fluorescence microscopy images, our experiments demonstrate that the proposed algorithm identifies nuclei better than previous nucleus segmentation algorithms.

    Characterization of sleep spindles using higher order statistics and spectra

    This work characterizes the dynamics of sleep spindles, observed in the electroencephalogram (EEG) recorded from humans during sleep, using both time- and frequency-domain methods based on higher-order statistics and spectra. The time-domain method combines second- and third-order correlations to reveal information on the stationarity of periodic spindle rhythms and to detect transitions between multiple activities. The frequency-domain method, based on the normalized spectrum and bispectrum, describes frequency interactions associated with nonlinearities occurring in the observed EEG.
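The third-order correlation at the heart of the time-domain method has a direct sample estimator. A minimal sketch follows; the function name and the simple biased estimator (plain average over the overlapping range, intended for a zero-mean signal) are our choices, not the paper's exact implementation.

```python
def third_order_correlation(x, tau1, tau2):
    """Sample estimate of c3(tau1, tau2) = E[x(n) x(n+tau1) x(n+tau2)].

    x: sequence of samples of a (nominally zero-mean) signal.
    tau1, tau2: non-negative integer lags.
    Unlike second-order correlation, c3 vanishes for Gaussian signals,
    which is what makes it useful for detecting nonlinear structure.
    """
    n = len(x)
    m = max(tau1, tau2)
    return sum(x[k] * x[k + tau1] * x[k + tau2]
               for k in range(n - m)) / (n - m)
```

The bispectrum used in the frequency-domain method is the two-dimensional Fourier transform of this quantity over (tau1, tau2), so the two views of spindle dynamics in the study are Fourier pairs.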