24 research outputs found

    High resolution geomatics techniques for coastline detection and monitoring: Boccasette and Barricata case studies (Po River Delta, Rovigo, Italy).

    Get PDF
    This thesis presents an experimental approach to defining a methodology for identifying the coastline, or instantaneous coastline, understood as a land-sea-air interface. As we will see, its geographical boundary is not easily defined in a universal way; by comparing three methodologies (GNSS surveying, classical topography, and Structure from Motion (SfM) photogrammetric surveying) we illustrate the difficulties of delimiting, in a unique way, the land-sea extent of the coastline. The study area lies in the Po River Delta, specifically the beaches of Boccasette and Barricata. The chapters first analyse the reference context and the location, introducing the coastal setting in question, which is affected by phenomena such as subsidence, a worrying issue for the Po Delta area that several studies have monitored over the years. The methodologies applied are then explained, outlining their characteristics to understand their potential in the case study, followed by the software used to process the data and produce the output. The data obtained with the various methodologies were compared in a GIS, which proved practical and fast for this purpose. The study mainly evaluates the distances, and the areas enclosed, between the segments joining the surveyed points, expressing the differences in terms of mean and standard deviation, in order to establish, partly on the basis of the operator's interpretation, which methodology is the most practical, precise and fast for monitoring purposes. The results obtained with the various methods were then compared and discussed, highlighting the more or less significant variations.
As we will see, there is a noticeable difference between the methodologies used directly in the field, which measure physical points, and the SfM technique, which is influenced by many factors. The zero-level coastline obtained from a DTM (Digital Terrain Model) was subsequently superimposed on and compared with the coastline extracted from a LiDAR survey performed in 2018. This second activity allowed, after appropriate processing, a four-year multi-temporal analysis of coastline changes and the identification and classification of areas of accretion and erosion. The research thus made it possible to: compare the different methodologies, assessing their applicability and limits; evaluate and define the most appropriate survey technique for studying and identifying the coastline; define the accuracy parameters for modelling the detected elements; create a database for future comparisons of coastal variations; and conduct a multi-temporal analysis with the available LiDAR data.
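The comparison described above, reporting the mean and standard deviation of the distances between surveyed coastline points and a reference line, can be sketched as follows. This is an illustrative example only, not the thesis's actual code; the point coordinates and function names are ours.

```python
# Illustrative sketch (not the thesis's code): compare a surveyed coastline
# point set against a reference polyline by shortest distance, then report
# the mean and standard deviation of the offsets.
import math
from statistics import mean, stdev

def point_segment_distance(p, a, b):
    """Shortest distance from 2-D point p to the segment from a to b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamping to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def coastline_offsets(points, reference):
    """Distance of each surveyed point to the nearest segment of a reference polyline."""
    return [min(point_segment_distance(p, a, b)
                for a, b in zip(reference, reference[1:]))
            for p in points]

# Hypothetical data: GNSS points vs. an SfM-derived coastline (metres).
gnss = [(0.0, 0.1), (10.0, -0.2), (20.0, 0.3)]
sfm = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
d = coastline_offsets(gnss, sfm)
print(f"mean offset = {mean(d):.2f} m, std = {stdev(d):.2f} m")
```

In a real workflow these statistics would be computed in a GIS over the full set of surveyed points, but the underlying measure is the same.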

    Design and Testing of Electronic Devices for Harsh Environments

    Get PDF
    In this thesis, an overview of research activity focused on the development, design and testing of electronic devices and systems for harsh environments is reported. The scope of the work has been the design and validation flow of integrated circuits operating in two harsh applications: automotive and High Energy Physics experiments. To fulfil the severe electrical and environmental operating conditions of automotive applications, a systematic methodology has been followed in the design of an innovative Intelligent Power Switch: several design solutions have been developed at the architectural and circuit level, integrating on-chip self-diagnostic capabilities and full protection against high voltage and reverse polarity, the effects of wiring parasitics, and over-current and over-temperature phenomena. Moreover, integrated current-slope and soft-start techniques have ensured low EMI, making the Intelligent Power Switch also configurable to drive different interchangeable loads efficiently. The proposed device has been implemented in a 0.35 μm HV-CMOS technology and embedded in a 3rd-generation mechatronic brush-holder regulator System-on-Chip for an automotive alternator. Electrical simulations and experimental characterisation and testing at component level and at on-board system level have proven that the proposed design allows a compact and smart power-switch realisation capable of facing the harshest automotive conditions. The smart driver has been able to supply up to 1.5 A to various types of loads (e.g. incandescent lamp bulbs, LEDs) at operating temperatures across the wide range of -40 °C to 150 °C, with robustness against high voltage up to 55 V and reverse polarity down to -15 V.
The second branch of the research activity has been framed within the High Energy Physics area, leading to the development of a general-purpose, flexible protocol for data acquisition and for the distribution of Timing, Trigger and Control signals, and to its implementation in radiation-tolerant interfaces in 130 nm CMOS technology. The several features integrated in the protocol have made it suitable for different High Energy Physics experiments: flexibility with respect to bandwidth and latency requirements, robustness of critical information against radiation-induced errors, compatibility with different data types, and flexibility with respect to the architecture of the control and readout systems are the key features of this novel protocol. Innovative radiation-hardening techniques have been studied and implemented in the test chip to ensure proper functioning in operating environments with a high level of radiation, such as the Large Hadron Collider at CERN in Geneva. An FPGA-based emulator has been developed and, in a first phase, employed for functional validation of the protocol. In a second step, the emulator has been modified into a test bed to assess the transmitter and receiver interfaces embedded in the test chip. An extensive phase of tests has proven the functioning of the interfaces at the three speed options, 4xF, 8xF and 16xF (F = reference clock frequency), in different configurations. Finally, irradiation tests have been performed at the CERN X-ray irradiation facility, bearing out the proper behaviour of the interfaces up to 40 Mrad(SiO2)

    The annotation of continuous media

    Get PDF
    In principle, the presentation of continuous media is time-dependent. Examples of continuous media are audio, video and graphics animation. This work is on support for the annotation of continuous media, that is, the integration of voice comments with continuous-media documents such as music and video clips. This application has strict synchronisation requirements, both with respect to the media involved and to user interaction. The application involves functions such as storage, management, and the control of GUIs and of continuous-medium devices. These are realised by components which can be distributed across a network. New models and architectures have been defined to enable open distributed processing of applications, that is, distributed processing independent of operating systems. Abstractions are provided which facilitate the development of applications, and these execute supported by platforms that implement such open architectures. These architectures have been based on an object-based client/server model. Our work aims at exploring object-orientation, open distributed processing and some characteristics of continuous media, through the development and use of the proposed application. The application is designed as a set of objects with well-defined functions which interact with one another. A distinguishing feature of the application is that it involves reusable components and mechanisms. For example, a mechanism which enables components to control logical clocks and synchronise them is incorporated in the application in response to its synchronisation requirements. The implementation is based on ANSAware, a platform that supports open distributed processing and allows distributed objects to bind to each other, to interact with one another, and to exhibit concurrent activities. The performance of the implementation is examined with respect to the application's response to user requests.
Response times of operations such as play and pause are measured, and the final results are within a defined maximum tolerance. The development approach is analysed with respect to support for real-time activities in the application and to software reuse in the proposed model. The thesis concludes by reviewing the suitability of the object-oriented approach for the development of distributed continuous-media applications
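The logical-clock mechanism mentioned in the abstract can be illustrated with a minimal Lamport-style sketch. This is our own hedged example of the general technique, not the thesis's actual mechanism or the ANSAware API; the class and component names are hypothetical.

```python
# Minimal sketch of a logical clock of the kind used to synchronise
# distributed components (a Lamport clock). Names are illustrative only.
class LogicalClock:
    def __init__(self):
        self.time = 0

    def tick(self):
        """Advance the clock for a local event (e.g. the user presses play)."""
        self.time += 1
        return self.time

    def send(self):
        """Timestamp attached to an outgoing message; sending is an event."""
        return self.tick()

    def receive(self, msg_time):
        """Merge a remote timestamp, Lamport-style: jump past both clocks."""
        self.time = max(self.time, msg_time) + 1
        return self.time

# Hypothetical components: a GUI object tells a media-player object to play.
gui, player = LogicalClock(), LogicalClock()
t = gui.send()        # GUI issues "play" at logical time 1
player.receive(t)     # player's clock advances past the GUI's timestamp
print(gui.time, player.time)
```

Ordering events by these timestamps gives each component a consistent view of which user actions and media events happened before which, which is the essence of such synchronisation mechanisms.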

    Component-based software engineering: a quantitative approach

    Get PDF
    Dissertation presented to obtain the degree of Doctor in Informatics from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.
Background: Often, claims in Component-Based Development (CBD) are supported only by qualitative expert opinion, rather than by quantitative data. This contrasts with normal practice in other sciences, where sound experimental validation of claims is standard. Experimental Software Engineering (ESE) aims to bridge this gap. Unfortunately, it is common to find experimental validation efforts that are hard to replicate and compare, hindering the build-up of a body of knowledge in CBD. Objectives: In this dissertation our goals are (i) to contribute to the evolution of ESE with respect to the replicability and comparability of experimental work, and (ii) to apply our proposals to CBD, thus contributing to its deeper and sounder understanding. Techniques: We propose a process model for ESE, aligned with current experimental best practices, and combine this model with a measurement technique called Ontology-Driven Measurement (ODM). ODM aims to improve the state of practice in metrics definition and collection by making metrics definitions formal and executable, without sacrificing their usability. ODM uses standard technologies that can be well adapted to current integrated development environments. Results: Our contributions include the definition and preliminary validation of a process model for ESE and the proposal of ODM for supporting metrics definition and collection in the context of CBD. We use both the process model and ODM to perform a series of experimental studies in CBD, including the cross-validation of a component metrics set for JavaBeans, a case study on the influence of practitioners' expertise on a sub-process of component development (component code inspections), and an observational study on reusability patterns of pluggable components (Eclipse plug-ins).
These experimental studies implied proposing, adapting, or selecting adequate ontologies, as well as formally defining metrics upon each of those ontologies. Limitations: Although our experimental work covers a variety of component models and, orthogonally, both process and product, the plethora of opportunities for using our quantitative approach to CBD is far from exhausted. Conclusions: The main contribution of this dissertation is the illustration, through practical examples, of how we can combine our experimental process model with ODM to support the experimental validation of claims in the context of CBD, in a repeatable and comparable way. In addition, the techniques proposed in this dissertation are generic and can be applied to other software development paradigms.
Support: Departamento de Informática of the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa (FCT/UNL); Centro de Informática e Tecnologias da Informação of the FCT/UNL; Fundação para a Ciência e Tecnologia through the STACOS project (POSI/CHS/48875/2002); the Experimental Software Engineering Network (ESERNET); Association Internationale pour les Technologies Objets (AITO); Association for Computing Machinery (ACM)

    Ricardo Felipe Custódio

    Get PDF

    A facility to Search for Hidden Particles (SHiP) at the CERN SPS

    Get PDF
    A new general-purpose fixed-target facility is proposed at the CERN SPS accelerator, aimed at exploring the domain of hidden particles and at making measurements with tau neutrinos. Hidden particles are predicted by a large number of models beyond the Standard Model. The high intensity of the 400 GeV SPS beam allows probing a wide variety of models containing light long-lived exotic particles with masses below O(10) GeV/c^2, including very weakly interacting low-energy SUSY states. The experimental programme of the proposed facility is capable of being extended in the future, e.g. to include direct searches for Dark Matter and Lepton Flavour Violation. (Technical Proposal)

    Topical Workshop on Electronics for Particle Physics

    Get PDF
    The purpose of the workshop was to present results and original concepts for electronics research and development relevant to particle physics experiments as well as accelerator and beam instrumentation at future facilities; to review the status of electronics for the LHC experiments; to identify and encourage common efforts for the development of electronics; and to promote information exchange and collaboration in the relevant engineering and physics communities

    Proceedings of the 20th International Conference on Multimedia in Physics Teaching and Learning

    Get PDF