
    Grid Computing: Concepts and Applications

    The challenge of the CERN experiments at the Large Hadron Collider (LHC), which will collect data at rates of the order of PB/year, requires the development of GRID technologies to optimise the exploitation of distributed computing power and automatic access to distributed data storage. Several projects are addressing the problem of setting up the hardware infrastructure of a GRID, as well as the development of the middleware required to manage it: a working GRID should appear as a set of services, accessible to registered applications, which make the different computing and storage resources cooperate. As happened with the World Wide Web, GRID concepts are in principle important not only for High Energy Physics (HEP): for this reason GRID developers, while keeping in mind the needs of HEP experiments, are trying to design GRID services in the most general way. Two applications are described as examples: the CERN/ALICE experiment at the LHC and a recently approved INFN project (GPCALMA), which will set up a GRID prototype connecting several mammographic centres in Italy.

    GPCALMA: a Grid Approach to Mammographic Screening

    The next generation of High Energy Physics experiments requires a GRID approach to distributed computing and the associated data management: the key concept is the "Virtual Organisation" (VO), a group of geographically distributed users with a common goal and the will to share their resources. A similar approach is being applied to a group of hospitals that joined the GPCALMA project (Grid Platform for a Computer Assisted Library for MAmmography), which will allow common screening programs for the early diagnosis of breast and, in the future, lung cancer. HEP techniques come into play in the application code, which uses neural networks for the image analysis and shows performance similar to that of radiologists in the diagnosis. GRID technologies will allow remote image analysis and interactive online diagnosis, with a significant reduction of the delays presently associated with screening programs.
    Comment: 4 pages, 3 figures; to appear in the Proceedings of Frontier Detectors for Frontier Physics, 9th Pisa Meeting on Advanced Detectors, 25-31 May 2003, La Biodola, Isola d'Elba, Italy

    Computer-aided detection systems to improve lung cancer early diagnosis: state-of-the-art and challenges

    Lung cancer is one of the most lethal types of cancer, largely because it is rarely diagnosed early enough. The detection of pulmonary nodules, potential lung cancers, in Computed Tomography (CT) scans is a very challenging and time-consuming task for radiologists. To support radiologists, researchers have developed Computer-Aided Diagnosis (CAD) systems for the automated detection of pulmonary nodules in chest CT scans. Despite the high level of technological development and the proven benefits for overall detection performance, the use of CAD in clinical practice is far from being a common procedure. In this paper we investigate the causes underlying this discrepancy and present a solution to tackle it: M5L, a web- and cloud-based on-demand CAD. In addition, we show how the combination of traditional image-processing techniques with state-of-the-art classification algorithms makes it possible to build a system whose performance can be substantially higher than that of any CAD developed so far. This outcome opens the possibility of using CAD as a clinical decision support for radiologists.
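One ingredient of multi-component CAD systems like the one described above is the fusion of candidate findings produced by different subsystems. A toy sketch of such a fusion step, with entirely hypothetical data and an arbitrary matching radius (this is not M5L's actual algorithm), could look like:

```python
# Toy illustration (not M5L's actual algorithm) of combining candidate
# findings from two CAD subsystems: candidates whose centres lie close
# together are merged with averaged suspicion scores; the rest are
# kept. Coordinates are in mm; the 5 mm radius is an arbitrary choice.
import math

def merge_candidates(cands_a, cands_b, radius_mm=5.0):
    """Each candidate is (x, y, z, score); returns the merged list."""
    merged, used = [], set()
    for a in cands_a:
        match = next((j for j, b in enumerate(cands_b)
                      if j not in used
                      and math.dist(a[:3], b[:3]) <= radius_mm), None)
        if match is None:
            merged.append(a)                      # only CAD A saw it
        else:
            used.add(match)
            b = cands_b[match]                    # both saw it: average
            merged.append(tuple((u + v) / 2 for u, v in zip(a, b)))
    # candidates seen only by CAD B
    merged += [b for j, b in enumerate(cands_b) if j not in used]
    return merged

combined = merge_candidates([(0.0, 0.0, 0.0, 0.9)],
                            [(1.0, 1.0, 1.0, 0.7), (40.0, 0.0, 0.0, 0.4)])
# the two nearby candidates merge; the distant one is kept separately
```

A real system would weight the scores by each subsystem's measured reliability rather than taking a plain average.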

    Unstructured MEL modelling of non-linear 3D ship hydrodynamics

    In the present work, non-linear effects in ship hydrodynamics are investigated in the context of potential flow theory. These effects have three main causes: the changes of the wetted geometry of the floating body, the waterline dynamics, and the fully non-linear nature of the free surface boundary conditions. In order to assess the importance of tackling the non-linear effects, a three-dimensional frequency-domain study of the S175 containership is carried out, at different Froude numbers, using linear frequency domain methods and a partly non-linear time domain method.

    A time domain analysis, based on an unstructured mixed Eulerian-Lagrangian (MEL) description of the fluid flow, is implemented with the aim of exploring potential flow non-linear effects. In this framework, the mixed boundary value problem of the Eulerian phase of the MEL scheme is tackled by means of a Boundary Element Method using constant elements (a direct Rankine panel method). At a given time step, the impervious boundary condition is specified on Neumann boundaries, whereas the potential on the free surface is prescribed on Dirichlet boundaries. The solution of the boundary value problem yields the potential on the Neumann boundaries and its normal derivative on the Dirichlet boundaries. In the Lagrangian phase, the free surface boundary conditions are then integrated in time. This method was used to solve the linear time domain radiation problem, i.e. by applying linearised free surface boundary conditions and solving the mixed boundary value problem on the mean undisturbed free surface, for the cases of forced motions of a hemisphere and of a Wigley hull. In addition, the linear time domain method is extended to unified hydroelastic analysis in the time domain for the 2-node and 3-node bending modes. Results are presented for the Wigley hull undergoing prescribed forced oscillations for both rigid and flexible mode shapes.

    The extension of the MEL scheme to a numerical tool capable of addressing several degrees of non-linearity (from body non-linear to fully non-linear) is also discussed. In this context, two numerical formulations to calculate the time derivative of the velocity potential are implemented: a backward finite difference scheme and an exact calculation based on the time-harmonic property of the velocity potential. In the latter case, a second boundary value problem is constructed and solved for the time derivative of the potential on Neumann boundaries and for the normal acceleration on Dirichlet boundaries. Results of both approaches, for the case of a sphere undergoing forced oscillations in heave, are compared to results obtained by other time domain methods. Moreover, after the boundary value problem is solved, a radial basis function representation of the velocity potential and of the free surface elevation is constructed; this representation allows the estimation of the gradient of the velocity potential (body non-linear and fully non-linear simulations) and of the free surface steepness (fully non-linear simulations). The results of the body non-linear analysis, for large amplitudes of oscillation in heave, are presented for both the sphere and the Wigley hull. For the latter, body non-linear results for the coupling of heave into the first distortion mode (2-node) are also presented. The results of the fully non-linear simulations are presented for the case of a sphere.

    Finally, the suitability of two unstructured meshing libraries is investigated in the context of the MEL simulation scheme. Practical issues related to (re)meshing at each time step, the representation of ship-like geometries, free surface evolution and numerical stability are highlighted for both libraries.
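The two-phase MEL time step described above (Eulerian boundary value solve, then Lagrangian time integration of the free-surface conditions) can be sketched schematically. In this minimal sketch, a small diagonally dominant matrix stands in for the BEM influence matrix, the free-surface conditions are the linearised ones on z = 0, and all names and sizes are illustrative:

```python
import numpy as np

g = 9.81    # gravity [m/s^2]
dt = 0.01   # time step [s]
N = 8       # number of free-surface collocation points (toy size)

def solve_bvp(phi_fs, body_vel):
    """Stand-in for the Eulerian phase: the constant-panel Rankine BEM
    solve, which returns the normal velocity on the free-surface
    (Dirichlet) panels given the prescribed potential there and the
    impervious condition on the body (Neumann) panels. A toy
    diagonally dominant matrix plays the role of the influence matrix."""
    A = np.eye(N) + 0.05 * np.ones((N, N))
    rhs = body_vel - phi_fs               # toy mixed boundary data
    return np.linalg.solve(A, rhs)

def mel_step(phi_fs, eta, body_vel):
    """One MEL step with linearised free-surface conditions on z = 0:
       d(eta)/dt = dphi/dn,   d(phi)/dt = -g * eta."""
    w = solve_bvp(phi_fs, body_vel)       # Eulerian phase
    eta_new = eta + dt * w                # Lagrangian phase
    phi_new = phi_fs - dt * g * eta       # (explicit Euler in time)
    return phi_new, eta_new

phi, eta = np.zeros(N), np.zeros(N)
for n in range(100):                      # forced heave oscillation
    phi, eta = mel_step(phi, eta, body_vel=np.cos(2.0 * n * dt))
```

The actual method additionally remeshes the free surface at each step and, in the non-linear variants, solves the BVP on the instantaneous wetted geometry.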

    On the editing process of Jesuit texts in the letters of Portuguese America in the sixteenth century

    Considering the work Cartas dos primeiros jesuítas do Brasil (1954), organised by Serafim Leite, this article studies some aspects of the book production process in the sixteenth century, in particular the conditions of writing and editing of the manuscripts inside and outside the milieu of the Catholic Church, and their circulation in the Metropolis and in the Colony. Starting from the observation of a historical moment in which the ways of reading, writing and disseminating texts underwent profound changes, the aim is to reconstruct part of the history of the material culture of the book in the sixteenth century.

    AliEn - EDG Interoperability in ALICE

    AliEn (ALICE Environment) is a GRID-like system for large scale job submission and distributed data management, developed and used in the context of ALICE, the CERN LHC heavy-ion experiment. With the aim of exploiting upcoming Grid resources to run AliEn-managed jobs and store the produced data, the problem of AliEn-EDG interoperability was addressed and an interface was designed. One or more EDG (European Data Grid) User Interface machines run the AliEn software suite (Cluster Monitor, Storage Element and Computing Element) and act as interface nodes between the systems. An EDG Resource Broker is seen by the AliEn server as a single Computing Element, while the EDG storage is seen by AliEn as a single, large Storage Element; files produced on EDG sites are registered both in the EDG Replica Catalogue and in the AliEn Data Catalogue, thus ensuring accessibility from both worlds. In fact, both registrations are required: the AliEn one is used for data management, the EDG one to guarantee the integrity of and access to EDG-produced data. A prototype interface has been successfully deployed using the ALICE AliEn Server and the EDG and DataTAG testbeds.
    Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics (CHEP03), La Jolla, CA, USA, March 2003; 4 pages, PDF, 2 figures. PSN TUCP00
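The dual-registration scheme described above can be sketched in a few lines. All class and method names here are hypothetical stand-ins, not the AliEn or EDG API; the point is only that every produced file gets one entry per catalogue so it stays visible from both systems:

```python
# Illustrative sketch (hypothetical names, not the AliEn/EDG API) of
# the dual registration: a file produced on an EDG site is registered
# both in the EDG Replica Catalogue and in the AliEn Data Catalogue.

class Catalogue:
    """Minimal file catalogue mapping logical to physical file names."""
    def __init__(self, name):
        self.name = name
        self.entries = {}

    def register(self, lfn, pfn):
        self.entries[lfn] = pfn

def register_output(lfn, pfn, edg_rc, alien_dc):
    """Both registrations are required: the AliEn one for data
    management, the EDG one for integrity of and access to the data."""
    edg_rc.register(lfn, pfn)
    alien_dc.register(lfn, pfn)

edg_rc = Catalogue("EDG Replica Catalogue")
alien_dc = Catalogue("AliEn Data Catalogue")
register_output("lfn:/alice/sim/run001/galice.root",
                "pfn://some-edg-se.example.org/data/run001/galice.root",
                edg_rc, alien_dc)
```

In the real interface the two registrations serve different consumers, so a production job must treat the pair as one atomic bookkeeping step: a file present in only one catalogue is invisible to the other world.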

    Predictive models based on Support Vector Machines: whole-brain versus regional analysis of structural MRI in the Alzheimer’s disease

    Decision-making systems trained on structural magnetic resonance imaging (MRI) data of subjects affected by Alzheimer's disease (AD) and of healthy controls (CTRL) are becoming widespread prognostic tools for subjects with mild cognitive impairment (MCI). This study compares the performance of three classification methods based on support vector machines (SVMs), using as initial sets of brain voxels (i.e., features): (1) the segmented grey matter (GM); (2) regions of interest (ROIs) obtained by voxel-wise t-test filtering; (3) parcelled ROIs, defined according to prior knowledge. Recursive feature elimination (RFE) is applied in all cases to investigate whether feature reduction improves the classification accuracy. We analysed more than 600 Alzheimer's Disease Neuroimaging Initiative (ADNI) subjects, training the SVMs on the AD/CTRL dataset and evaluating them on a trial MCI dataset. The classification performance, evaluated as the area under the receiver operating characteristic curve (AUC), reaches AUC = (88.9 ± 0.5)% in 20-fold cross-validation on the AD/CTRL dataset when the GM is classified as a whole. The highest discrimination accuracy between MCI converters and non-converters is achieved when SVM-RFE is applied to the whole GM: with an AUC reaching (70.7 ± 0.9)%, it outperforms both ROI-based approaches in predicting the conversion to AD.
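The SVM-RFE pipeline with cross-validated AUC scoring described above can be sketched with standard scikit-learn components. This is an illustrative sketch, not the study's actual pipeline: synthetic data stands in for the grey-matter voxel features, and all sample sizes and parameters are arbitrary:

```python
# Illustrative sketch (not the study's pipeline): linear SVM with
# recursive feature elimination (RFE), scored by cross-validated
# ROC AUC on synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=100,
                           n_informative=10, random_state=0)

clf = make_pipeline(
    StandardScaler(),
    # drop 20% of the remaining features per step, keep the best 20
    RFE(SVC(kernel="linear"), n_features_to_select=20, step=0.2),
    SVC(kernel="linear"),
)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
print(f"mean cross-validated AUC: {auc:.3f}")
```

Putting the RFE step inside the pipeline matters: feature selection is then refitted within each cross-validation fold, avoiding the optimistic bias that selecting features on the full dataset would introduce.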