
    Multidisciplinary perspectives on Artificial Intelligence and the law

    This open access book presents an interdisciplinary, multi-authored, edited collection of chapters on Artificial Intelligence (‘AI’) and the Law. AI technology has come to play a central role in the modern data economy. Through a combination of increased computing power, the growing availability of data and the advancement of algorithms, AI has now become an umbrella term for some of the most transformational technological breakthroughs of this age. The importance of AI stems from both the opportunities that it offers and the challenges that it entails. While AI applications hold the promise of economic growth and efficiency gains, they also create significant risks and uncertainty. The potential and perils of AI have thus come to dominate modern discussions of technology and ethics – and although AI was initially allowed to largely develop without guidelines or rules, few would deny that the law is set to play a fundamental role in shaping the future of AI. As the debate over AI is far from over, the need for rigorous analysis has never been greater. This book thus brings together contributors from different fields and backgrounds to explore how the law might provide answers to some of the most pressing questions raised by AI. An outcome of the Católica Research Centre for the Future of Law and its interdisciplinary working group on Law and Artificial Intelligence, it includes contributions by leading scholars in the fields of technology, ethics and the law.

    Reliable Sensor Intelligence in Resource Constrained and Unreliable Environment

    The objective of this research is to design sensor intelligence that is reliable in a resource-constrained, unreliable environment. Intelligent sensor systems involve many sources of variation and uncertainty, so building reliable sensor intelligence is critical. Much prior work seeks to design reliable sensor intelligence by making the task itself more robust and reliable. This thesis suggests that, along with improving the task itself, early warning based on task-reliability quantification can further improve sensor intelligence. A DNN-based early warning generator quantifies task reliability from the spatiotemporal characteristics of the input, and the early warning adjusts sensor parameters to avoid system failure. This thesis presents an early warning generator that predicts task failure due to sensor-hardware-induced input corruption and controls the sensor operation accordingly. Moreover, a lightweight uncertainty estimator is presented to account for DNN model uncertainty in task-reliability quantification without the prohibitive computation of a stochastic DNN. Cross-layer uncertainty estimation is also discussed to consider the effect of PIM variations.
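
    To make the control loop concrete, here is a minimal PyTorch sketch of the idea: a small auxiliary network scores how reliable the main task is expected to be on the current input, and a low score triggers a sensor adjustment instead of an inference. The module names, network sizes, threshold, and sensor API are illustrative assumptions, not the thesis implementation.

```python
# Minimal sketch (PyTorch) of reliability-based early warning, assuming
# hypothetical module names and a dummy sensor API for illustration.
import torch
import torch.nn as nn

class EarlyWarningGenerator(nn.Module):
    """Scores the probability that the main task succeeds on this input."""
    def __init__(self, in_features: int = 1024):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(in_features, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
            nn.Sigmoid(),  # reliability score in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.scorer(x.flatten(1)).squeeze(-1)

class DummySensor:
    """Stand-in for a real sensor with controllable parameters."""
    def increase_exposure(self):
        print("early warning: raising sensor exposure")

def control_step(x, warning_net, sensor, threshold=0.5):
    """Skip the task and adjust the sensor when reliability is low."""
    reliability = warning_net(x)
    if reliability.mean().item() < threshold:
        sensor.increase_exposure()
        return None  # the task output would not be trustworthy
    return reliability

x = torch.randn(8, 1024)  # a batch of flattened sensor frames
control_step(x, EarlyWarningGenerator(), DummySensor())
```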

    Natural and Technological Hazards in Urban Areas

    Natural hazard events and technological accidents are distinct causes of environmental impact. Natural hazards are physical phenomena that have been active throughout geological time, whereas technological hazards result from actions or facilities created by humans. In our time, however, the two increasingly combine, with natural events triggering or compounding man-made hazards. Overpopulation and urban development in areas prone to natural hazards increase the impact of natural disasters worldwide. Additionally, urban areas are frequently characterized by intense industrial activity and rapid, poorly planned growth that threatens the environment and degrades the quality of life. Proper urban planning is therefore crucial to minimize fatalities and to reduce the environmental and economic impacts that accompany both natural and technological hazardous events.

    Guided rewriting and constraint satisfaction for parallel GPU code generation

    Graphics Processing Units (GPUs) are notoriously hard to optimise manually owing to their complex scheduling and memory hierarchies. What is needed are good automatic code generators and optimisers for such parallel hardware. Functional approaches such as Accelerate, Futhark and LIFT leverage a high-level algorithmic Intermediate Representation (IR) to expose parallelism and abstract the implementation details away from the user. However, producing efficient code for a given accelerator remains challenging. Existing code generators depend either on user input to choose among a subset of hard-coded optimisations or on automated exploration of the implementation search space. The former lacks extensibility, while the latter is too costly given the size of the search space. A hybrid approach is needed, in which a space of valid implementations is built automatically and explored with the aid of human expertise. This thesis presents a solution combining user-guided rewriting and automatically generated constraints to produce high-performance code. The first contribution is an automatic tuning technique that finds a balance between performance and memory consumption. Leveraging its functional patterns, the LIFT compiler is empowered to infer tuning constraints and limit the search to valid tuning combinations only. Next, the thesis reframes parallelisation as a constraint satisfaction problem: parallelisation constraints are extracted automatically from the input expression, and a solver is used to identify valid rewritings. The constraints truncate the search space to valid parallel mappings only by capturing the scheduling restrictions of the GPU in the context of a given program. A synchronisation barrier insertion technique is proposed to prevent data races and improve the efficiency of the generated parallel mappings. The final contribution of this thesis is the guided rewriting method, where the user encodes a design space of structural transformations using high-level IR nodes called rewrite points. These strongly typed pragmas express macro rewrites and expose design choices as explorable parameters. The thesis proposes a small set of reusable rewrite points to achieve tiling, cache locality, data reuse and memory optimisation. A comparison with the vendor-provided handwritten kernels of the ARM Compute Library and with the TVM code generator demonstrates the effectiveness of these contributions: with convolution as a use case, LIFT-generated direct and GEMM-based convolution implementations are shown to perform on par with state-of-the-art solutions on a mobile GPU. Overall, this thesis demonstrates that a functional IR lends itself well to user-guided and automatic rewriting for high-performance code generation.
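
    As a toy illustration of treating parallel mapping as constraint satisfaction, the sketch below enumerates assignments of loop dimensions to GPU scheduling levels and keeps only those satisfying simple device restrictions. The levels, constraints and sizes are invented for illustration and are far simpler than the scheduling rules the thesis extracts from LIFT expressions.

```python
# Toy sketch: parallelisation as a constraint satisfaction problem.
# Each loop dimension is assigned a GPU scheduling level, and simple
# constraints prune the space to valid parallel mappings only. The
# constraints below are illustrative, not LIFT's actual rules.
from itertools import product

LEVELS = ("workgroup", "local", "sequential")

def valid(mapping, sizes, max_local=256):
    # Illustrative rule 1: at most one dimension per parallel level.
    if mapping.count("workgroup") > 1 or mapping.count("local") > 1:
        return False
    # Illustrative rule 2: respect the device's work-group size limit.
    local_threads = 1
    for level, size in zip(mapping, sizes):
        if level == "local":
            local_threads *= size
    return local_threads <= max_local

def solve(sizes):
    """Enumerate every valid mapping of loop dims to scheduling levels."""
    return [m for m in product(LEVELS, repeat=len(sizes)) if valid(m, sizes)]

# Three nested loops with trip counts 1024, 512 and 4:
for mapping in solve([1024, 512, 4]):
    print(mapping)
```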

    Specialized translation at work for a small expanding company: my experience of internationalization into Chinese for Bioretics© S.r.l.

    Global markets are currently immersed in two all-encompassing and unstoppable processes: internationalization and globalization. While the former pushes companies to look beyond the borders of their country of origin to forge relationships with foreign trading partners, the latter fosters standardization across countries by reducing spatiotemporal distances and breaking down geographical, political, economic and socio-cultural barriers. In recent decades, another force has emerged to propel these unifying drives: Artificial Intelligence, together with the advanced technologies that aim to implement human cognitive abilities in machinery. The “Language Toolkit – Le lingue straniere al servizio dell’internazionalizzazione dell’impresa” project, promoted by the Department of Interpreting and Translation (Forlì Campus) in collaboration with the Romagna Chamber of Commerce (Forlì-Cesena and Rimini), seeks to help Italian SMEs make their way into the global market. It is precisely within this project that this dissertation was conceived. Its purpose is to present the translation and localization project from English into Chinese of a series of texts produced by Bioretics© S.r.l.: an investor deck, the company website, and part of the installation and use manual of the Aliquis© framework software, the company’s flagship product. The dissertation is structured as follows: Chapter 1 presents the project and the company in detail; Chapter 2 outlines the internationalization and globalization processes and the Artificial Intelligence market in both Italy and China; Chapter 3 provides the theoretical foundations for every aspect of specialized translation, including website localization; Chapter 4 describes the resources and tools used to perform the translations; Chapter 5 proposes an analysis of the source texts; and Chapter 6 comments on translation strategies and choices.

    Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5

    This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions from researchers working in different fields of application and in mathematics, and is available in open access. The contributions collected in this volume have either been published or presented at international conferences, seminars, workshops and journals since the dissemination of the fourth volume in 2015, or they are new. The contributions in each part of the volume are ordered chronologically. The first part presents theoretical advances in DSmT, dealing mainly with modified Proportional Conflict Redistribution (PCR) rules of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classifiers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of (quasi-)vacuous belief assignments in the fusion of sources of evidence, with their Matlab codes. Because more applications of DSmT have emerged since the fourth volume appeared in 2015, the second part covers selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and networks for ship classification. Finally, the third part presents contributions related to belief functions in general, published or presented since 2015. These contributions concern decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, the generalization of Bayes’ theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, the negator of a belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, and hybrid techniques mixing deep learning with belief functions.
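
    For readers unfamiliar with the PCR rules that recur throughout the volume, here is a minimal sketch of PCR5 for two sources over a two-element frame, following the standard two-source PCR5 definition: the conjunctive consensus plus proportional redistribution of each partial conflict back to the two focal elements that produced it. The mass values are invented for illustration.

```python
# Minimal sketch of the two-source PCR5 combination rule used in DSmT.
# Focal elements are frozensets; the example masses are illustrative.
from itertools import product

def pcr5(m1, m2):
    """Combine two basic belief assignments with PCR5: conjunctive
    consensus, plus proportional redistribution of each partial
    conflict m1(Y)*m2(Z) (with Y ∩ Z = ∅) back to Y and Z in
    proportion to m1(Y) and m2(Z)."""
    out = {}
    for (y, a), (z, b) in product(m1.items(), m2.items()):
        x = y & z
        if x:  # non-conflicting pair: conjunctive consensus on Y ∩ Z
            out[x] = out.get(x, 0.0) + a * b
        else:  # conflicting pair: split a*b proportionally to a and b
            out[y] = out.get(y, 0.0) + a * a * b / (a + b)
            out[z] = out.get(z, 0.0) + b * b * a / (a + b)
    return out

A, B = frozenset("A"), frozenset("B")
m1 = {A: 0.6, B: 0.3, A | B: 0.1}
m2 = {A: 0.2, B: 0.7, A | B: 0.1}
combined = pcr5(m1, m2)
print({tuple(sorted(k)): round(v, 4) for k, v in combined.items()})
# The result sums to 1, as a valid mass assignment must.
```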

    Medium Voltage Solid-State Transformer: An IEC 60076-3 based design


    Growth of Group IV and III-V Semiconductor Materials for Silicon Photonics: Buffer Layer and Light Source Development

    High data transmission speeds, high levels of integration, and low manufacturing costs have established Si photonics as a crucial technology for next-generation data interconnects and communications systems. It involves a variety of components, including light emitters, photodetectors, amplifiers, waveguides and modulators. Because of its indirect bandgap, silicon cannot serve as an efficient on-chip light source, and this has been one of the field's formidable challenges. Within the framework of the monolithic approach, this thesis studies two essential aspects of this challenge, the optimisation of buffer layers and the development of light sources, by incorporating and improving different systems of Group IV thin-film and III-V quantum dot (QD) semiconductor materials. The monolithic approach focuses on the direct epitaxial growth of highly efficient light sources, usually III-V semiconductor lasers, on a single Si chip. However, because of the material dissimilarities between III-V materials and Si, heteroepitaxy introduces a high density of crystalline defects such as threading dislocations (TDs), thermal cracks and anti-phase domains, severely impeding the performance and yield of the laser. For instance, TDs act as non-radiative recombination centres, while thermal cracks hinder efficient evanescent coupling of the emitted light into the Si waveguide. To address these defects, complex buffer growth techniques with micron-scale thickness are typically employed. The research in this thesis is divided into two parts, buffer layer optimisation and light source development, each outlining alternative strategies for overcoming the above-mentioned hurdles to monolithic growth. The first part highlights the optimisation of buffer layer growth to reduce threading dislocations for the monolithic integration of high-performance direct-bandgap III-V and Group IV light sources on Si. The growth optimisation of low defect-density Ge buffer layers epitaxially grown on Si was investigated first: defect elimination in Ge buffers with doped and undoped seed layers of increasing total thickness is studied under a variety of growth regimes, doping techniques and annealing processes. This study demonstrates that a 500 nm thin Ge buffer achieves the same defect level (1.3 × 10⁸ cm⁻²) as 2.2 μm of GaAs grown on Si, which greatly increases the thickness budget for the subsequent dislocation filter layers (DFLs) and laser structure growth before thermal cracks form. Meanwhile, a low threading dislocation density of 3.3 × 10⁷ cm⁻² is obtained for 1 μm of Ge grown on Si. The second part places emphasis on the development of light sources in the near-infrared wavelength range for Si photonics. 1) The development of GeSn, an emerging direct-bandgap light source for Si photonics, is presented; it offers wide bandgap tuneability and full compatibility with Si complementary metal-oxide-semiconductor (CMOS) processing. Growing the high Sn compositions of GeSn required for efficient light generation is challenging, and the growth is generally severely affected by large surface roughness and Sn segregation. In this work, ex-situ rapid thermal annealing of the grown GeSn layer is investigated first, showing that with proper annealing the strain can be relaxed by 90% without triggering Sn segregation. This method shows its potential for both material growth and device fabrication. In addition, strain-compensated layers and in-situ annealing techniques have been developed. Significantly improved surface quality has been confirmed by in-situ reflection high-energy electron diffraction (RHEED) observations and atomic force microscopy (AFM) images, and transmission electron microscopy (TEM) results reveal the high crystal quality of the multiple quantum wells (MQWs) grown on such buffer layers. 2) The final section details the development of InAs/InP QDs emitting near the strategic 1.55 μm wavelength, the lowest-loss window of optical fibre. InAs/InP QD growth is prone to inhomogeneous quantum-dash morphologies, which broaden the photoluminescence (PL) spectra and degrade carrier confinement. Growth parameters and techniques, including deposition thickness and growth temperature, have been investigated, and an indium-flush technique is applied to improve the uniformity of the dots; narrow room-temperature PL linewidths of 47.9 meV and 50.9 meV have been achieved for single-layer and five-layer quantum dot samples, respectively. These structures enable the fabrication of small-footprint microdisk lasers with lasing thresholds as low as 30 μW.

    ICEBEAR-3D: An Advanced Low Elevation Angle Auroral E region Imaging Radar

    The Ionospheric Continuous-wave E region Bistatic Experimental Auroral Radar (ICEBEAR) is an auroral E region radar that operated from 7 December 2017 until September 2019. During its first two years of operation, ICEBEAR could spatially locate E region scatter and meteor trail targets only in range and azimuth; elevation angles were not determinable because of its East-West uniform linear receiving antenna array. Measuring the elevation angles of targets viewed at low elevation with radar interferometers has been a long-standing problem. Past high-latitude radars have attempted to obtain elevation angles of E region targets using North-South baselines, but have always measured erroneous elevation angles in the low elevation regime (0° to ≈30° above the horizon), leaving interesting scientific questions about scatter altitudes in the auroral E region unanswered. The work in this thesis encompasses the design of the ICEBEAR-3D system for the acquisition of these important elevation angles. The receiver antenna array was redesigned using a custom phase-error minimization and stochastic antenna-location perturbation technique, which produces phase-tolerant receiver antenna arrays. The resulting 45-baseline sparse non-uniform coplanar T-shaped array was designed for aperture synthesis radar imaging. Conventional aperture synthesis radar imaging techniques assume point-like incoherent targets and image in a Cartesian basis over a narrow field of view; these methods are incompatible with horizon-pointing E region radars such as ICEBEAR. Instead, radar targets were imaged using the Suppressed Spherical Wave Harmonic Transform (Suppressed-SWHT) technique. This imaging method uses precalculated spherical harmonic coefficient matrices to transform the visibilities into brightness maps by direct matrix multiplication, and the under-sampled image-domain artefacts (the dirty beam) are suppressed by taking products of brightness maps of differing harmonic order. From the images, elevation and azimuth angles of arrival were obtained. The excellent phase tolerance of ICEBEAR shed new light on the long-standing low elevation angle problem, leading to the development of a properly phase-referenced vertical interferometry geometry that allows horizon-pointing radar interferometers to unambiguously measure elevation angles near the horizon, ultimately yielding accurate elevation angles from zenith to horizon.
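
    As a schematic of the imaging step, the numpy sketch below forms one brightness map per harmonic order by direct matrix multiplication with a precalculated coefficient matrix, then multiplies the maps together to suppress order-dependent artefacts. The matrix contents, sizes and data are random stand-ins for illustration, not the actual ICEBEAR-3D coefficients or pipeline.

```python
# Toy numpy sketch of the Suppressed-SWHT imaging idea: brightness maps
# from direct matrix multiplication, artefacts suppressed by products of
# differing-order maps. All values are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n_baselines, n_pixels = 45, 1000  # 45-baseline array, coarse sky grid

# Precalculated spherical-harmonic coefficient matrices, one per maximum
# harmonic order (random complex placeholders of the right shape).
coeff = {order: rng.standard_normal((n_pixels, n_baselines))
                + 1j * rng.standard_normal((n_pixels, n_baselines))
         for order in (15, 25, 35)}

# Measured cross-spectra (visibilities), one value per baseline.
visibilities = (rng.standard_normal(n_baselines)
                + 1j * rng.standard_normal(n_baselines))

# One brightness map per harmonic order: a direct matrix multiplication.
maps = {order: np.abs(m @ visibilities) for order, m in coeff.items()}

# Suppression step: the product of differing-order maps keeps the common
# (true) peak while damping order-dependent dirty-beam sidelobes.
suppressed = np.prod(np.vstack(list(maps.values())), axis=0)
peak = int(np.argmax(suppressed))  # pixel index -> azimuth/elevation bin
print("brightest pixel:", peak)
```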