
    Possible Signals of new phenomena in hadronic interactions at dN/dη = 5.5 ± 1.2

    The dependence of the average transverse momentum on multiplicity shows, in many experiments at centre-of-mass energies ranging from 22 to 7000 GeV, a slope change at a charged-particle rapidity density that is constant within systematic uncertainties. We find correlated signals which, together with the slope change, may indicate a transition to a new mechanism of particle production.Comment: 32 pages, 84 figures, preprint

    Texture classification using invariant ranklet features

    A novel invariant texture classification method is proposed. Invariance to linear/non-linear monotonic gray-scale transformations is achieved by submitting the image under study to the ranklet transform, an image processing technique relying on the analysis of the relative rank of pixels rather than on their gray-scale value. Some texture features are then extracted from the ranklet images resulting from the application at different resolutions and orientations of the ranklet transform to the image. Invariance to 90°-rotations is achieved by averaging, for each resolution, correspondent vertical, horizontal, and diagonal texture features. Finally, a texture class membership is assigned to the texture feature vector by using a support vector machine (SVM) classifier. Compared to three recent methods found in the literature and evaluated on the same Brodatz and Vistex datasets, the proposed method performs better. Also, invariance to linear/non-linear monotonic gray-scale transformations and 90°-rotations is evidenced by training the SVM classifier on texture feature vectors formed from the original images, then testing it on texture feature vectors formed from contrast-enhanced, gamma-corrected, histogram-equalized, and 90°-rotated images.
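    The rotation-invariance step described above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the ranklet features for the three orientations are already available as per-resolution arrays, and simply averages the correspondent entries.

```python
import numpy as np

def rotation_invariant_features(vert, horiz, diag):
    """Average correspondent vertical, horizontal, and diagonal ranklet
    texture features, per resolution, to obtain 90-degree rotation
    invariance. Each argument: array of shape (n_resolutions, n_features)."""
    vert, horiz, diag = (np.asarray(a, dtype=float) for a in (vert, horiz, diag))
    return (vert + horiz + diag) / 3.0

# Hypothetical features at 2 resolutions, 3 features per orientation
v = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
h = [[3.0, 2.0, 1.0], [6.0, 5.0, 4.0]]
d = [[2.0, 2.0, 2.0], [5.0, 5.0, 5.0]]
inv = rotation_invariant_features(v, h, d)  # each row averages the three orientations
```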

    OPTIMIZATION OF A DISTRIBUTED GENETIC ALGORITHM ON A CLUSTER OF WORKSTATIONS FOR THE DETECTION OF MICROCALCIFICATIONS

    We have developed a method for the detection of clusters of microcalcifications in digital mammograms. Here, we present a genetic algorithm used to optimize the choice of the parameters in the detection scheme. The optimization has allowed us to improve the performance, to study in detail the influence of the various parameters on the performance, and to investigate accurately the behavior of the detection method on unknown cases. We reach a sensitivity of 96.2% with 0.7 false positive clusters per image on the Nijmegen database; we are also able to identify the most significant parameters. In addition, we have examined the feasibility of a distributed genetic algorithm implemented on a non-dedicated Cluster Of Workstations. We get very good results both in terms of quality and efficiency.
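    A genetic algorithm for tuning real-valued detection parameters can be sketched as below. This is a generic, minimal sketch, not the distributed implementation from the paper: the fitness function, bounds, and hyperparameters are placeholders standing in for the detection score and the scheme's parameters.

```python
import random

def genetic_optimize(fitness, bounds, pop_size=20, generations=30, seed=0):
    """Minimal genetic algorithm: evolve real-valued parameter vectors
    within (low, high) bounds to maximize `fitness`."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim) if dim > 1 else 0
            child = a[:cut] + b[cut:]              # one-point crossover
            i = rng.randrange(dim)                 # mutate one gene
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy stand-in for the detection score: peaks at thresholds (0.3, 0.7)
score = lambda p: -((p[0] - 0.3) ** 2 + (p[1] - 0.7) ** 2)
best = genetic_optimize(score, [(0.0, 1.0), (0.0, 1.0)])
```

    In the distributed variant, the fitness evaluations (one full detection run per parameter vector) are the natural unit of work to farm out to the workstations, since they are independent within a generation.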

    A New Approach to Image Reconstruction in Positron Emission Tomography Using Artificial Neural Networks

    This study investigates the possibility of using an Artificial Neural Network (ANN) for reconstructing Positron Emission Tomography (PET) images. The network is trained with simulated data which include physical effects such as attenuation and scattering. Once the training ends, the weights of the network are held constant. The network is able to reconstruct every type of source distribution contained inside the area mapped during the learning. The reconstruction of a simulated brain phantom in a noiseless case shows an improvement compared with Filtered Back-Projection (FBP) reconstruction. In noisy cases there is still an improvement, even if we do not compensate for noise fluctuations. These results show that it is possible to reconstruct PET images using ANNs. Initially we used a Dec Alpha; then, due to the high data parallelism of this reconstruction problem, we ported the learning to a Quadrics (SIMD) machine, suited for the realization of a small dedicated medical system. These results encourage us to continue with further studies that will make it possible to reconstruct images of larger dimensions than those used in the present work (32 × 32 pixels).
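    The idea of learning a reconstruction operator from simulated projections can be illustrated with a toy linear version. This is a sketch under strong simplifying assumptions (a random stand-in projection matrix, a single linear layer fitted by least squares, noiseless data, illustrative sizes), not the network architecture of the study; it only shows the train-once, freeze-weights, reconstruct-anything-in-the-mapped-area workflow.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_proj = 16, 24                  # toy 4x4 image, 24 projection bins
P = rng.random((n_proj, n_pix))         # stand-in simulated projection (system) matrix

X_train = rng.random((500, n_pix))      # random simulated source distributions
Y_train = X_train @ P.T                 # their noiseless projections

# "Training": least-squares fit of one linear layer mapping projections to
# images; afterwards the weights W are held constant, as in the study.
W, *_ = np.linalg.lstsq(Y_train, X_train, rcond=None)

x_new = rng.random(n_pix)               # unseen source inside the mapped area
recon = (x_new @ P.T) @ W               # reconstruction from its projections
```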

    SYSTEM FOR AUTOMATIC DETECTION OF CLUSTERED MICROCALCIFICATIONS IN DIGITAL MAMMOGRAMS

    In this paper, we investigate the performance of a Computer Aided Diagnosis (CAD) system for the detection of clustered microcalcifications in mammograms. Our detection algorithm consists of the combination of two different methods. The first, based on difference-image techniques and Gaussianity statistical tests, picks out the most obvious signals. The second is able to discover more subtle microcalcifications by exploiting a multiresolution analysis by means of the wavelet transform. We can tune the two methods separately, so that each of them detects signals with similar features. By combining the signals coming out of the two parts through a logical OR operation, we can discover microcalcifications with different characteristics. Our algorithm yields a sensitivity of 91.4% with 0.4 false positive clusters per image on the 40 images of the Nijmegen database.
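    The OR-combination step is simple enough to show directly. A minimal sketch, assuming each method's output is available as a boolean detection mask over the mammogram (the masks below are hypothetical 2×2 examples):

```python
import numpy as np

def combine_detections(mask_a, mask_b):
    """Combine the signals found by the two detection methods with a
    logical OR, so microcalcifications caught by either method are kept."""
    return np.logical_or(mask_a, mask_b)

obvious = np.array([[1, 0], [0, 0]], dtype=bool)   # difference-image method
subtle  = np.array([[0, 0], [1, 0]], dtype=bool)   # wavelet-based method
combined = combine_detections(obvious, subtle)     # union of both detections
```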

    Experimental equation of state in proton-proton and proton-antiproton collisions and phase transition to quark gluon plasma

    We deduce approximate equations of state from experimental measurements in proton-proton and proton-antiproton collisions. Thermodynamic quantities are estimated by combining measurements of the average transverse momentum versus the pseudorapidity density dN/deta with estimates of the interaction-region size from Bose-Einstein correlation measurements, or from a theoretical model which relates dN/deta to the impact parameter. The results are very similar to theoretical predictions in the case of a crossover from hadron gas to quark gluon plasma. According to our analysis, the possible crossover should start at dN/deta about 6 and end at dN/deta about 24.Comment: 26 pages, 6 figures

    Surface-antigen expression profiling of B cell chronic lymphocytic leukemia: from the signature of specific disease subsets to the identification of markers with prognostic relevance

    Studies of gene expression profiling have been successfully used for the identification of molecules to be employed as potential prognosticators. In analogy with gene expression profiling, we have recently proposed a novel method, named surface-antigen expression profiling, to identify the immunophenotypic signature of B-cell chronic lymphocytic leukemia subsets with different prognosis. According to this approach, surface marker expression data can be analysed by data mining tools identical to those employed in gene expression profiling studies, including unsupervised and supervised algorithms, with the aim of identifying the immunophenotypic signature of B-cell chronic lymphocytic leukemia subsets with different prognosis. Here we provide an overview of the overall strategy employed for the development of such an "outcome class-predictor" based on surface-antigen expression signatures. In addition, we will also discuss how to transfer the obtained information into routine clinical practice by providing a flow-chart indicating how to select the most relevant antigens and build up a prognostic scoring system by weighting each antigen according to its predictive power. Although referred to B-cell chronic lymphocytic leukemia, the methodology discussed here can also be useful in the study of diseases other than B-cell chronic lymphocytic leukemia, when the purpose is to identify novel prognostic determinants.
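    A weighted antigen-based scoring system of the kind described above can be sketched as a threshold-and-sum rule. The antigens, weights, and cut-offs below are purely illustrative placeholders, not values from the paper:

```python
def prognostic_score(expression, weights, thresholds):
    """Hypothetical scoring rule: each antigen contributes its weight when
    its expression level reaches its cut-off; the sum is the patient's score."""
    return sum(w for antigen, w in weights.items()
               if expression.get(antigen, 0.0) >= thresholds[antigen])

# Illustrative antigens, predictive-power weights, and positivity cut-offs
weights = {"CD38": 2, "CD49d": 2, "CD62L": 1}
thresholds = {"CD38": 0.30, "CD49d": 0.30, "CD62L": 0.40}

patient = {"CD38": 0.55, "CD49d": 0.10, "CD62L": 0.45}
score = prognostic_score(patient, weights, thresholds)  # CD38 and CD62L positive
```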

    Differential cross section measurements for the production of a W boson in association with jets in proton–proton collisions at √s = 7 TeV

    Measurements are reported of differential cross sections for the production of a W boson, which decays into a muon and a neutrino, in association with jets, as a function of several variables, including the transverse momenta (pT) and pseudorapidities of the four leading jets, the scalar sum of jet transverse momenta (HT), and the difference in azimuthal angle between the directions of each jet and the muon. The data sample of pp collisions at a centre-of-mass energy of 7 TeV was collected with the CMS detector at the LHC and corresponds to an integrated luminosity of 5.0 fb⁻¹. The measured cross sections are compared to predictions from Monte Carlo generators, MadGraph + pythia and sherpa, and to next-to-leading-order calculations from BlackHat + sherpa. The differential cross sections are found to be in agreement with the predictions, apart from the pT distributions of the leading jets at high pT values, the distributions of HT at high HT and low jet multiplicity, and the distribution of the difference in azimuthal angle between the leading jet and the muon at low values. (United States Department of Energy; National Science Foundation (U.S.); Alfred P. Sloan Foundation)

    Risk Portfolio Optimization Using the Markowitz MVO Model, Related to Human Limitations in Predicting the Future from the Perspective of the Al-Qur`an

    Risk portfolio analysis in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, given the human inability to predict the future precisely, as written in Al-Quran surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective here is to minimize the variance among all portfolios, or alternatively, to maximize the expected return among all portfolios that have at least a certain expected return. Furthermore, this study focuses on optimizing the risk portfolio with the so-called Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks for the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimize the objective function xᵀΣx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the solution of the optimal risk portfolio for some investments, obtained straightforwardly with MATLAB R2007b software together with its graphical analysis.
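    The quadratic program above (the study uses MATLAB) can be sketched in Python. This minimal version makes two simplifying assumptions: the return constraint is treated as binding (μᵀx = r rather than ≥ r), and short selling is allowed, so the problem reduces to a linear KKT system; the returns and covariance matrix are illustrative, not from the paper.

```python
import numpy as np

# Illustrative inputs: expected returns mu and covariance matrix Sigma
mu = np.array([0.10, 0.07, 0.05])
Sigma = np.array([[0.09, 0.01, 0.00],
                  [0.01, 0.04, 0.01],
                  [0.00, 0.01, 0.02]])
r = 0.07                                # target expected return

# KKT system for: minimize x^T Sigma x  s.t.  1^T x = 1  and  mu^T x = r
A = np.vstack([np.ones(3), mu])         # equality constraints, A x = b
b = np.array([1.0, r])
K = np.block([[2 * Sigma, A.T],
              [A, np.zeros((2, 2))]])
sol = np.linalg.solve(K, np.concatenate([np.zeros(3), b]))
w = sol[:3]                             # minimum-variance portfolio weights
```

    Handling the inequality μᵀx ≥ r and a no-short-selling bound x ≥ 0 requires a quadratic-programming solver (e.g. MATLAB's quadprog, which is what such MVO studies typically rely on) rather than a single linear solve.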

    Juxtaposing BTE and ATE – on the role of the European insurance industry in funding civil litigation

    One of the ways in which legal services are financed, and indeed shaped, is through private insurance arrangements. Two contrasting types of legal expenses insurance (LEI) contracts seem to dominate in Europe: before the event (BTE) and after the event (ATE) legal expenses insurance. Notwithstanding institutional differences between legal systems, BTE and ATE insurance arrangements may be instrumental if government policy is geared towards strengthening a market-oriented system of financing access to justice for individuals and business. At the same time, emphasizing the role of a private industry as a keeper of the gates to justice raises issues of accountability and transparency, not readily reconcilable with demands of competition. Moreover, multiple actors (clients, lawyers, courts, insurers) are involved, causing behavioural dynamics which are not easily predicted or influenced. Against this background, this paper looks into BTE and ATE arrangements by analysing the particularities of the BTE and ATE arrangements currently available in some European jurisdictions and by painting a picture of their respective markets and legal contexts. This allows for some reflection on the performance of BTE and ATE providers as both financiers and keepers. Two issues emerge from the analysis that are worthy of further reflection. Firstly, there is the problematic long-term sustainability of some ATE products. Secondly, there are the challenges faced by policymakers who would like to nudge consumers into voluntarily taking out BTE LEI.