
    Enabling parallel and interactive distributed computing data analysis for the ALICE experiment

    AliEn (ALICE Environment) is the production environment developed by the ALICE collaboration at CERN. It provides a set of Grid tools enabling the full offline computational workflow of the experiment (simulation, reconstruction, and data analysis) in a distributed and heterogeneous computing environment. In addition to analysis on the Grid, ALICE users perform local interactive analysis using ROOT and the Parallel ROOT Facility (PROOF). PROOF enables physicists to analyse medium-sized (200-300 TB) datasets in parallel on a short time scale. The default installation of PROOF is a static dedicated cluster, typically of 200-300 cores. This well-proven approach has limitations, specifically for the analysis of larger datasets or when the installation of a dedicated cluster is not possible. Using a new framework called Proof on Demand (PoD), PROOF can be used directly on Grid-enabled clusters by dynamically assigning interactive nodes on user request. This thesis presents the PoD on AliEn project. The integration of Proof on Demand in the AliEn framework provides private dynamic PROOF clusters as a Grid service. This functionality is transparent to the user, who simply submits interactive jobs to the AliEn system. The ROOT framework is also used by physicists to carry out the Monte Carlo simulation of the detector. The engineers working on the mechanical design of the detector need to collaborate with the physicists; however, the software used by the engineers is not compatible with ROOT. This thesis describes a second result obtained during this PhD project: the implementation of the TGeoCad interface, which allows the conversion of ROOT geometries to the STEP format compatible with CAD systems. The interface provides an important communication and collaboration tool between the physicists and engineers dealing with the simulation and the design of the detector geometry.

    Online Analysis of Dynamic Streaming Data

    The thesis "Online Analysis of Dynamic Streaming Data" addresses distance measurement for dynamic, semi-structured data in continuous data streams, in order to make analyses of these data structures possible already at runtime. To this end, a formalisation for computing distances between static and dynamic trees is introduced and extended with an explicit treatment of the dynamics of the attributes of individual tree nodes. The real-time analysis based on this distance measurement is complemented by density-based clustering, to demonstrate applications of the clustering to classification as well as to anomaly detection. The results of this work rest on a theoretical analysis of the introduced formalisation of distance measures for dynamic trees. These analyses are supported by empirical measurements based on monitoring data of batch jobs from the batch system of the GridKa data and computing centre. The evaluation of the proposed formalisation, and of the real-time analysis methods built on it, demonstrates the efficiency and scalability of the approach. It is also shown that considering attributes and attribute statistics is of particular importance for the quality of analyses of dynamic, semi-structured data. Furthermore, the evaluation shows that the quality of the results can be improved further by an independent combination of several distances. In particular, the results of this work enable the analysis of data that change over time.
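The thesis's formalisation is not reproduced in the abstract, but the idea of an attribute-aware distance between trees can be sketched in a few lines (a simplified, hypothetical illustration — node labels, a positional child pairing, and a normalised attribute gap — not the formalisation developed in the thesis):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    attrs: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

def attr_distance(a, b):
    # Penalise keys present on only one side; compare shared keys
    # with a gap normalised into [0, 1).
    keys = set(a) | set(b)
    if not keys:
        return 0.0
    d = 0.0
    for k in keys:
        if k not in a or k not in b:
            d += 1.0
        else:
            d += abs(a[k] - b[k]) / (1.0 + abs(a[k]) + abs(b[k]))
    return d / len(keys)

def size(t):
    return 1 + sum(size(c) for c in t.children)

def tree_distance(t1, t2):
    # Node cost: label mismatch plus attribute distance.
    cost = (t1.label != t2.label) + attr_distance(t1.attrs, t2.attrs)
    # Pair children positionally; unmatched subtrees cost their size.
    n = max(len(t1.children), len(t2.children))
    for i in range(n):
        if i < len(t1.children) and i < len(t2.children):
            cost += tree_distance(t1.children[i], t2.children[i])
        elif i < len(t1.children):
            cost += size(t1.children[i])
        else:
            cost += size(t2.children[i])
    return cost
```

A distance of this shape can feed directly into density-based clustering of streamed batch-job trees, which is the kind of downstream use the thesis evaluates.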

    Contributions to the study of Autism Spectrum Brain connectivity

    Autism Spectrum Disorder (ASD) is a highly prevalent neurodevelopmental condition with a large social and economic impact, affecting the entire lives of families. There is an intense search for biomarkers that can be assessed as early as possible in order to initiate treatment and to prepare the family to deal with the challenges imposed by the condition. Brain imaging biomarkers are of special interest. Specifically, functional connectivity data extracted from resting-state functional magnetic resonance imaging (rs-fMRI) should make it possible to detect brain connectivity alterations. Machine learning pipelines encompass the estimation of the functional connectivity matrix from brain parcellations, feature extraction, and the building of classification models for ASD prediction. The works reported in the literature are very heterogeneous from the computational and methodological point of view. In this thesis we carry out a comprehensive computational exploration of the impact of the choices involved in building these machine learning pipelines.
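As an illustration of the first pipeline stage, a functional connectivity matrix is commonly built as the matrix of Pearson correlations between regional rs-fMRI time courses, with its upper triangle flattened into a feature vector for classification. A minimal pure-Python sketch (real pipelines typically use tools such as numpy or nilearn):

```python
import math

def pearson(x, y):
    # Pearson correlation coefficient of two equal-length time courses.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def connectivity_matrix(series):
    # series: one time course per brain region (from a parcellation).
    # Returns the symmetric region-by-region correlation matrix.
    r = len(series)
    return [[pearson(series[i], series[j]) for j in range(r)] for i in range(r)]

def upper_triangle(mat):
    # Flatten the strictly upper triangle into a feature vector;
    # the diagonal (all 1.0) and lower mirror carry no extra information.
    r = len(mat)
    return [mat[i][j] for i in range(r) for j in range(i + 1, r)]
```

The resulting vector (one entry per region pair) is what the classification models for ASD prediction are trained on.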

    Big data-driven multimodal traffic management: trends and challenges


    Machine learning for particle identification & deep generative models towards fast simulations for the Alice Transition Radiation Detector at CERN

    This Master's thesis outlines the application of machine learning techniques, predominantly deep learning techniques, to certain aspects of particle physics. Its two main aims, particle identification and high-energy physics detector simulation, are pertinent to research avenues pursued by physicists working with the ALICE (A Large Ion Collider Experiment) Transition Radiation Detector (TRD), within the Large Hadron Collider (LHC) at CERN (the European Organization for Nuclear Research).

    Using web 2.0 technologies to facilitate the collaborative design process among undergraduate engineering students: an actor network study.

    Doctor of Philosophy in Education. University of KwaZulu-Natal, Durban, 2016. In this thesis I am motivated by a keen interest in design collaboration, and a belief that the quality of design interactions could be enhanced by employing a repertoire of the new and emerging collaborative technologies in the design process. In this study I employed actor network theory's (ANT's) methodological and theoretical framework to investigate the use of Web 2.0-facilitated collaborative design by Industrial and Manufacturing Engineering students at the Harare Institute of Technology. In line with ANT, I traced the collaborative design process by following the actors in action (Latour, 2005) when the forces of the network were at work, picking up the traces they left behind to constitute the empirical data for the study. By employing ANT analytical tools, the data of the network-tracing activity reveals that the Web 2.0-facilitated collaborative process is an emergent actor network that evolves from associations created among the actors as they negotiate the alignment of interests through a series of translations, occurring through the moments of problematisation, interessement, enrolment and mobilisation (Callon, 1986b). As the actors went through the moments of translation, various interpretations of the design problem were translated into technical solutions and procedures to be followed in search of a satisfying design solution. The process of achieving agreement (or a stable network) depends on the translations that take place among the actors. The analysis shows that Web 2.0-facilitated collaborative design is an emergent process. It is a process that evolves from a translation process, during which a hodgepodge of decisions that cannot wait are taken in a complex, dynamic, fluid and constantly changing environment where actions cannot be planned or predicted in any mechanical way (Akrich, Callon, Latour, & Monaghan, 2002).
Therefore, the path that the design process takes cannot be predetermined, but emerges from the network of relations created by the actors as they work together to achieve their commonly agreed design goals. Considering Web 2.0-facilitated collaborative design as an emergent process clearly demonstrated that it does not take place in a step-by-step way, as depicted by many design models. Instead, the process moves back and forth between different domains as the design problem and solution co-evolve and are continuously up for revision (Downey, 2005; Petersen, 2013). The affordances of Web 2.0 technology supported the messy talk (Iorio, Peschiera, & Taylor, 2011) that was critical to the development of design solutions. The emergent character of Web 2.0-facilitated collaborative design offers important theoretical and practical lessons for design educators seeking to improve the teaching and learning of the collaborative design process. With collaborative design as an emergent process, it is no longer methods alone that produce results, but the reassemblage of the totality of translation that takes place among the actors into a stable network of relationships, and it cannot be taught outside of authentic design projects.

    Sustainability science for water: Bibliometric analysis and social-ecological resilience thinking

    Incorporating resilience thinking, the thesis uses bibliometric analysis to identify emerging approaches and technology in sustainability science for water resources. It conducts a resilience-based assessment of a Chinese freshwater lake exposed to the disturbances and impacts of the Three Gorges Dam. A new resilience assessment framework is developed and applied to the case of Dongting Lake. Social networks for community resilience to the environmental changes triggered by the dam are also explored.

    Electromagnetic Side-Channel Resilience against Lightweight Cryptography

    Side-channel attacks are an unpredictable risk factor in cryptography. Observations of leakage through physical parameters of digital devices, e.g., power consumption and electromagnetic (EM) radiation, are therefore essential to minimise vulnerabilities associated with cryptographic functions. Compared to costs in the past, performing side-channel attacks with inexpensive test equipment is becoming a reality. Internet-of-Things (IoT) devices are resource-constrained, and lightweight cryptography is a novel approach, still in progress, towards IoT security that is expected to provide sufficient data and privacy protection in such a constrained ecosystem. Cryptanalysis of the physical leakage of these emerging ciphers is therefore crucial. EM side-channel attacks also have a significant impact on digital forensics today. In the existing literature, power analysis has received considerable attention, whereas other phenomena, such as EM emissions, still need to be properly evaluated for the role they can play in forensic analysis.

The emphasis of this thesis is on lightweight cryptanalysis. A preliminary investigation found no published Correlation EM Analysis (CEMA) of the PRESENT lightweight algorithm. PRESENT is a block cipher that promises to be adequate for IoT devices and is expected to be used commercially in the future. In an effort to fill this research gap, this work examines the capabilities of a correlation EM side-channel attack against PRESENT. The Substitution box (S-box) of the first round of PRESENT was targeted, using a minimum of 256 EM waveforms, fewer than in comparable work in the literature. The attack indicates the possibility of retrieving 8 of the 10 bytes of the secret key. The experimental process started with Simple EM Analysis (SEMA) and was gradually enhanced up to a CEMA. The thesis presents the methodology of the attack modelling and the observations, followed by a critical analysis. A technical review of IoT technology and a comprehensive literature review on lightweight cryptology are also included.
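The core of a correlation analysis of this kind can be illustrated with a small, self-contained simulation (a hypothetical sketch, not the thesis's measurement setup): each EM sample is modelled as the Hamming weight of a first-round S-box output plus Gaussian noise, and each of the 16 key-nibble guesses is ranked by the absolute Pearson correlation between its predicted leakage and the traces.

```python
import random

# The 4-bit PRESENT S-box, as given in the cipher specification.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def hw(x):
    # Hamming weight: the classic leakage model for CMOS devices.
    return bin(x).count("1")

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / ((vx * vy) ** 0.5) if vx and vy else 0.0

def cema_recover_nibble(plaintexts, traces):
    # Rank each key guess by |corr(HW(Sbox(pt ^ guess)), trace)|
    # and return the best-correlating guess.
    best, best_r = None, -1.0
    for guess in range(16):
        hyp = [hw(SBOX[p ^ guess]) for p in plaintexts]
        r = abs(pearson(hyp, traces))
        if r > best_r:
            best, best_r = guess, r
    return best

# Simulated experiment: 256 "EM samples" = leakage + Gaussian noise.
random.seed(1)
key = 0xA
pts = [random.randrange(16) for _ in range(256)]
traces = [hw(SBOX[p ^ key]) + random.gauss(0, 0.4) for p in pts]
```

In a real CEMA the simulated samples are replaced by points taken from measured EM waveforms; the correct nibble then stands out because its hypothesis correlates markedly better than any wrong guess.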