
    Multidisciplinary perspectives on Artificial Intelligence and the law

    This open access book presents an interdisciplinary, multi-authored, edited collection of chapters on Artificial Intelligence (‘AI’) and the Law. AI technology has come to play a central role in the modern data economy. Through a combination of increased computing power, the growing availability of data and the advancement of algorithms, AI has now become an umbrella term for some of the most transformational technological breakthroughs of this age. The importance of AI stems from both the opportunities that it offers and the challenges that it entails. While AI applications hold the promise of economic growth and efficiency gains, they also create significant risks and uncertainty. The potential and perils of AI have thus come to dominate modern discussions of technology and ethics – and although AI was initially allowed to largely develop without guidelines or rules, few would deny that the law is set to play a fundamental role in shaping the future of AI. As the debate over AI is far from over, the need for rigorous analysis has never been greater. This book thus brings together contributors from different fields and backgrounds to explore how the law might provide answers to some of the most pressing questions raised by AI. An outcome of the Católica Research Centre for the Future of Law and its interdisciplinary working group on Law and Artificial Intelligence, it includes contributions by leading scholars in the fields of technology, ethics and the law.

    Accurate quantum transport modelling and epitaxial structure design of high-speed and high-power In0.53Ga0.47As/AlAs double-barrier resonant tunnelling diodes for 300-GHz oscillator sources

    Terahertz (THz) wave technology is envisioned as an appealing and conceivable solution in the context of several potential high-impact applications, including sixth generation (6G) and beyond consumer-oriented ultra-broadband multi-gigabit wireless data-links, as well as high-resolution imaging, radar, and spectroscopy apparatuses employable in biomedicine, industrial processes, security/defence, and material science. Despite the technological challenges posed by the THz gap, recent scientific advancements suggest the practical viability of THz systems. However, the development of transmitters (Tx) and receivers (Rx) based on compact semiconductor devices operating at THz frequencies is urgently demanded to meet the performance requirements arising from emerging THz applications. Although there are several promising candidates, including high-speed III-V transistors and photodiodes, resonant tunnelling diode (RTD) technology offers a compact and high-performance option in many practical scenarios. However, the main weakness of the technology currently lies in the low output power capability of RTD THz Tx, which is mainly caused by underdeveloped and non-optimal device and circuit design approaches. Indeed, indium phosphide (InP) RTD devices can nowadays deliver only up to around 1 mW of radio-frequency (RF) power at around 300 GHz. In the context of THz wireless data-links, this severely impacts the Tx performance, limiting communication distance and data transfer capabilities which, at the current time, are of the order of a few tens of gigabits per second at distances below around 1 m. However, recent research studies suggest that several milliwatts of output power are required to achieve bit-rate capabilities of several tens of gigabits per second and beyond, and to reach several metres of communication distance in common operating conditions. Currently, the short-term target is set at 5−10 mW of output power at carrier frequencies of around 300 GHz, which would allow bit-rates in excess of 100 Gb/s, as well as wireless communications well above 5 m distance, in first-stage short-range scenarios. In order to reach it, maximisation of the RTD high-frequency RF power capability is of utmost importance. Despite that, reliable epitaxial structure design approaches, as well as accurate physics-based numerical simulation tools, aimed at RF power maximisation in the 300-GHz band are currently lacking. This work aims at proposing practical solutions to address the aforementioned issues. First, a physics-based simulation methodology was developed to accurately and reliably simulate the static current-voltage (IV) characteristic of indium gallium arsenide/aluminium arsenide (InGaAs/AlAs) double-barrier RTD devices. The approach relies on the non-equilibrium Green’s function (NEGF) formalism implemented in the Silvaco Atlas technology computer-aided design (TCAD) simulation package, requires a low computational budget, and allows In0.53Ga0.47As/AlAs RTD devices, which are pseudomorphically grown on lattice-matched InP substrates and commonly employed in oscillators working at around 300 GHz, to be modelled correctly.
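    To make the modelling task concrete, the sketch below computes the transmission coefficient of a double-barrier heterostructure with a simple transfer-matrix method. This is a deliberately simplified stand-in for the NEGF formalism used in the work: it assumes a single constant effective mass, purely coherent transport, and illustrative layer thicknesses and barrier height rather than values from the thesis.

```python
import numpy as np

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M0 = 9.1093837015e-31   # electron rest mass, kg
EV = 1.602176634e-19    # electron volt, J

def transmission(E_eV, layers, m_eff=0.043):
    """Transmission coefficient of a 1-D piecewise-constant potential,
    computed with the transfer-matrix method. A single constant effective
    mass is assumed throughout (a simplification of the real structure).
    layers: list of (thickness_nm, barrier_height_eV); leads sit at 0 eV."""
    m = m_eff * M0
    E = E_eV * EV
    # complex wavevector in each region (imaginary inside the barriers)
    ks = [np.sqrt(2 * m * (E + 0j)) / HBAR]
    ds = [0.0]
    for d_nm, V_eV in layers:
        ks.append(np.sqrt(2 * m * (E - V_eV * EV + 0j)) / HBAR)
        ds.append(d_nm * 1e-9)
    ks.append(ks[0])  # right lead identical to the left lead
    M = np.eye(2, dtype=complex)
    x = 0.0
    for j in range(len(ks) - 1):
        x += ds[j]                    # absolute position of interface j
        k1, k2 = ks[j], ks[j + 1]
        r = k1 / k2
        # match psi and psi' across the interface
        I = 0.5 * np.array(
            [[(1 + r) * np.exp(1j * (k1 - k2) * x),
              (1 - r) * np.exp(-1j * (k1 + k2) * x)],
             [(1 - r) * np.exp(1j * (k1 + k2) * x),
              (1 + r) * np.exp(-1j * (k1 - k2) * x)]])
        M = I @ M
    # identical leads: T = |t|^2 with t = 1 / M[1, 1]
    return float(abs(1.0 / M[1, 1]) ** 2)

# illustrative AlAs / InGaAs well / AlAs stack, placeholder values only
stack = [(1.4, 1.2), (4.5, 0.0), (1.4, 1.2)]  # (nm, eV)
energies = np.linspace(0.01, 0.8, 400)
T = [transmission(E, stack) for E in energies]
print("resonance near", round(float(energies[int(np.argmax(T))]), 3), "eV")
```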
    By selecting the appropriate physical models, and by retrieving the correct material parameters, together with a suitable finite-element discretisation of the associated heterostructure spatial domain, it is shown, by comparing simulation data with experimental results, that the developed numerical approach can reliably compute several quantities of interest that characterise the negative differential resistance (NDR) region of the DC IV curve, including peak current, peak voltage, and voltage swing, all of which are key parameters in RTD oscillator design. The demonstrated simulation approach was then used to study the impact of epitaxial structure design parameters, including those characterising the double-barrier quantum well, as well as the emitter and collector regions, on the electrical properties of the RTD device. In particular, a comprehensive simulation analysis was conducted, and the retrieved output trends were discussed based on the heterostructure band diagram, transmission coefficient energy spectrum, charge distribution, and DC current density-voltage (JV) curve. General design guidelines aimed at enhancing the RTD device maximum RF power capability are then deduced and discussed. To validate the proposed epitaxial design approach, an In0.53Ga0.47As/AlAs double-barrier RTD epitaxial structure providing several milliwatts of RF power was designed by employing the developed simulation methodology, and experimentally investigated through the microfabrication of RTD devices and subsequent high-frequency characterisation up to 110 GHz. The analysis, which included fabrication optimisation, reveals an expected RF power performance of up to around 5 mW and 10 mW at 300 GHz for 25 μm² and 49 μm² RTD devices, respectively, which is up to five times higher than the current state-of-the-art. Finally, in order to prove the practical employability of the proposed RTDs in oscillator circuits realised with low-cost photolithography, both coplanar waveguide and microstrip inductive stubs were designed through a full three-dimensional electromagnetic simulation analysis. In summary, this work makes an important contribution to the rapidly evolving field of THz RTD technology, and demonstrates the practical feasibility of realising high-power 300-GHz RTD devices, which will underpin the future development of Tx systems capable of the power levels required by forthcoming THz applications.
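    As a small numerical companion to the RF power figures quoted above, a classic textbook estimate of the maximum power an RTD can deliver is P_max ≈ (3/16)·ΔI·ΔV, where ΔI and ΔV are the current and voltage spans of the NDR region. The sketch below evaluates it for illustrative, not measured, device parameters.

```python
def rtd_max_rf_power(i_peak, i_valley, v_peak, v_valley):
    """Classic upper-bound estimate of RTD RF power for an oscillation
    swinging across the whole NDR region:
        P_max = (3/16) * (I_peak - I_valley) * (V_valley - V_peak)
    Inputs in amperes and volts, output in watts."""
    return (3.0 / 16.0) * (i_peak - i_valley) * (v_valley - v_peak)

# Illustrative numbers only, not the devices characterised in this work:
# a 35 mA current swing over a 0.7 V voltage swing gives roughly 4.6 mW,
# i.e. the order of magnitude targeted at 300 GHz.
print(rtd_max_rf_power(i_peak=0.060, i_valley=0.025, v_peak=0.9, v_valley=1.6))
```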

    A review of technical factors to consider when designing neural networks for semantic segmentation of Earth Observation imagery

    Semantic segmentation (classification) of Earth Observation imagery is a crucial task in remote sensing. This paper presents a comprehensive review of technical factors to consider when designing neural networks for this purpose. The review focuses on Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), Generative Adversarial Networks (GANs), and transformer models, discussing prominent design patterns for these artificial neural network (ANN) families and their implications for semantic segmentation. Common pre-processing techniques for ensuring optimal data preparation are also covered. These include methods for image normalization and chipping, strategies for addressing data imbalance in training samples, and techniques for overcoming limited data, including augmentation, transfer learning, and domain adaptation. By encompassing both the technical aspects of neural network design and the data-related considerations, this review provides researchers and practitioners with a comprehensive and up-to-date understanding of the factors involved in designing effective neural networks for semantic segmentation of Earth Observation imagery.
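    As a concrete illustration of the pre-processing steps mentioned above, the sketch below chips a large raster into fixed-size tiles and applies per-band standardisation. The function names, chip size, and the policy of discarding partial border chips are illustrative choices for this sketch, not prescriptions from the review.

```python
import numpy as np

def chip_image(image, chip_size=256, stride=256):
    """Split an (H, W, C) raster into fixed-size chips for network
    training; border regions that do not fill a whole chip are simply
    discarded in this sketch."""
    h, w = image.shape[:2]
    chips = []
    for y in range(0, h - chip_size + 1, stride):
        for x in range(0, w - chip_size + 1, stride):
            chips.append(image[y:y + chip_size, x:x + chip_size])
    return np.stack(chips)

def normalize_per_band(chips):
    """Per-band standardisation (zero mean, unit variance over the whole
    chip set), one common normalization choice for multispectral data."""
    mean = chips.mean(axis=(0, 1, 2), keepdims=True)
    std = chips.std(axis=(0, 1, 2), keepdims=True) + 1e-8
    return (chips - mean) / std

# e.g. a 1024x1024 scene with 4 bands -> 16 normalised 256x256 chips
scene = np.random.rand(1024, 1024, 4).astype(np.float32)
batch = normalize_per_band(chip_image(scene))
print(batch.shape)  # (16, 256, 256, 4)
```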

    Investigating the potential for detecting Oak Decline using Unmanned Aerial Vehicle (UAV) Remote Sensing

    This PhD project develops methods for the assessment of forest condition utilising modern remote sensing technologies, in particular optical imagery from unmanned aerial systems and Structure from Motion photogrammetry. The research focuses on health threats to the UK’s native oak trees, specifically Chronic Oak Decline (COD) and Acute Oak Decline (AOD). The data requirements and methods to identify these complex diseases are investigated using RGB and multispectral imagery with very high spatial resolution, as well as crown textural information. These image data are produced photogrammetrically from multitemporal unmanned aerial vehicle (UAV) flights, collected during different seasons to assess the influence of phenology on the ability to detect oak decline. Particular attention is given to the identification of declined oak health within the context of semi-natural forests and heterogeneous stands. Semi-natural forest environments pose challenges regarding naturally occurring variability. The studies investigate the potential and practical implications of UAV remote sensing approaches for the detection of oak decline under these conditions. COD is studied at Speculation Cannop, a section in the Forest of Dean dominated by 200-year-old oaks, where decline symptoms have been present for the last decade. Monks Wood, a semi-natural woodland in Cambridgeshire, is the study site for AOD, where trees exhibit active decline symptoms. Field surveys at these sites are designed and carried out to produce highly accurate differential GNSS positional information for symptomatic and control oak trees. This allows the UAV data to be related to COD or AOD symptoms and enables the validation of model predictions. Random Forest modelling is used to determine the explanatory value of remote sensing-derived metrics for distinguishing trees affected by COD or AOD from control trees. Spectral and textural variables are extracted from the remote sensing data using an object-based approach, adopting circular plots around crown centres at the individual tree level. Furthermore, the acquired UAV imagery is applied to generate a species distribution map, improving on the number of detectable species and the spatial resolution of a previous classification that used multispectral data from a piloted aircraft. In the production of the map, parameters relevant for classification accuracy, and for the identification of oak in particular, are assessed. The effects of plot size, sample size and data combinations are studied. With optimised parameters for species classification, the updated species map is subsequently employed to perform a wall-to-wall prediction of individual oak tree condition, evaluating the potential of a full-inventory detection of declined health. UAV-acquired data showed potential for the discrimination of control trees and declined trees in the cases of both COD and AOD. The greatest potential for detecting declined oak condition was demonstrated with narrowband multispectral imagery. Broadband RGB imagery was determined to be unsuitable for a robust distinction between declined and control trees. The greatest explanatory power was found in remotely sensed spectra related to photosynthetic activity, indicated by the high feature importance of near-infrared spectra and the vegetation indices NDRE and NDVI. High feature importance was also produced by texture metrics that describe structural variations within the crown.
    The findings indicate that the remotely sensed explanatory variables hold significant information regarding changes in leaf chemistry and crown morphology that relate to the chlorosis, defoliation and dieback occurring in the course of the decline. In the case of COD, a distinction of symptomatic from control trees was achieved with 75 % accuracy. Models developed for AOD detection yielded AUC scores of up to 0.98 when validated on independent sample data. Classification of oak presence was achieved with a User’s accuracy of 97 %, and the produced species map generated 95 % overall accuracy across the eight species within the study area in the north-east of Monks Wood. Despite these encouraging results, it was shown that the generalisation of the models is unfeasible at this stage and many challenges remain. A wall-to-wall prediction of decline status confirmed this inability to generalise, yielding unrealistic results with a high number of declined trees predicted. Identified weaknesses of the developed models indicate complexity related to the natural variability of heterogeneous forests combined with the diverse symptoms of oak decline. Specific to the presented studies, additional limitations were attributed to the limited ground truth and consequent overfitting, the binary classification of oak health status, and uncertainty in UAV-acquired reflectance values. Suggestions for future work are given and involve the extension of field sampling with a non-binary dependent variable to reflect the severity of oak decline-induced stress. Further technical research on the quality and reliability of UAV remote sensing data is also required.
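    A minimal sketch of the modelling pipeline described above, assuming per-crown mean reflectance values and crown texture metrics have already been extracted from the UAV imagery. The band layout, feature set and hyper-parameters here are placeholders, not the configurations used in the project.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def ndvi(nir, red):
    """Normalised Difference Vegetation Index."""
    return (nir - red) / (nir + red + 1e-8)

def ndre(nir, red_edge):
    """Normalised Difference Red Edge index."""
    return (nir - red_edge) / (nir + red_edge + 1e-8)

def fit_decline_model(crown_bands, texture, labels):
    """Fit a Random Forest separating symptomatic (1) from control (0)
    trees. crown_bands is assumed to hold per-tree mean reflectance as
    columns [red_edge, red, nir] (a hypothetical layout); texture holds
    per-tree crown texture metrics."""
    X = np.column_stack([
        ndvi(crown_bands[:, 2], crown_bands[:, 1]),
        ndre(crown_bands[:, 2], crown_bands[:, 0]),
        texture,
    ])
    model = RandomForestClassifier(n_estimators=500, oob_score=True,
                                   random_state=0)
    model.fit(X, labels)
    # feature importances indicate the explanatory value of each metric
    return model, model.feature_importances_
```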

    The European Experience: A Multi-Perspective History of Modern Europe, 1500–2000

    The European Experience brings together the expertise of nearly a hundred historians from eight European universities to internationalise and diversify the study of modern European history, exploring a grand sweep of time from 1500 to 2000. Offering a valuable corrective to the Anglocentric narratives of previous English-language textbooks, scholars from all over Europe have pooled their knowledge on comparative themes such as identities, cultural encounters, power and citizenship, and economic development to reflect the complexity and heterogeneous nature of the European experience. Rather than another grand narrative, the international author teams offer a multifaceted and rich perspective on the history of the continent of the past 500 years. Each major theme is dissected through three chronological sub-chapters, revealing how major social, political and historical trends manifested themselves in different European settings during the early modern (1500–1800), modern (1800–1900) and contemporary period (1900–2000). This resource is of utmost relevance to today’s history students in the light of ongoing internationalisation strategies for higher education curricula, as it delivers one of the first multi-perspective and truly ‘European’ analyses of the continent’s past. Beyond the provision of historical content, this textbook equips students with the intellectual tools to interrogate prevailing accounts of European history, and enables them to seek out additional perspectives in a bid to further enrich the discipline.

    An Efficient Federated Learning Method Enables Larger Local Intervals

    Federated learning is an emerging distributed machine learning framework that jointly trains a global model via a large number of local devices with data privacy protections. Its performance suffers from non-vanishing biases introduced by inconsistent local optima and from rugged client drifts caused by local over-fitting. In this thesis, we propose two novel and practical methods, FedSpeed and its variant FedSpeed-Ing, to alleviate the negative impacts posed by these problems. Concretely, FedSpeed applies a prox-correction term to the current local updates to efficiently reduce the bias introduced by the prox term, a regularizer necessary to maintain strong local consistency. Furthermore, FedSpeed merges the vanilla stochastic gradient with a perturbation computed from an extra gradient ascent step in the neighborhood, thereby alleviating the issue of local over-fitting. We then introduce two inertial momentum terms on the global update, yielding the FedSpeed-Ing method, which can further improve the optimization speed. Our theoretical analysis indicates that the convergence rate is related to both the communication rounds T and the local interval K, with an upper bound of O(1/T) when a proper local interval is set. Moreover, we conduct extensive experiments on real-world datasets to demonstrate the efficiency of the proposed FedSpeed, which converges significantly faster and achieves state-of-the-art (SOTA) performance in general FL experimental settings compared to several baselines, including FedAvg, FedProx, FedCM, FedAdam, SCAFFOLD, FedDyn, and FedADMM.
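    The sketch below gives a loose schematic of the two local ingredients described above (the prox term with a running correction, and the perturbed gradient from a neighbourhood ascent step). The update rules and hyper-parameters are illustrative simplifications, not the exact FedSpeed algorithm; see the thesis for the precise formulation.

```python
import numpy as np

def local_update(w_global, grad_fn, steps=10, lr=0.1, lam=0.1, rho=0.05):
    """Schematic FedSpeed-style local update on one client.
    w_global: server model (numpy array); grad_fn: stochastic gradient
    oracle; lam: prox weight; rho: ascent-step radius. All values here
    are placeholders for illustration."""
    w = w_global.copy()
    correction = np.zeros_like(w)        # prox-correction state
    for _ in range(steps):
        g = grad_fn(w)
        # extra gradient ascent step in the neighbourhood; its gradient
        # replaces the vanilla one to counter local over-fitting
        w_adv = w + rho * g / (np.linalg.norm(g) + 1e-12)
        g_pert = grad_fn(w_adv)
        prox = lam * (w - w_global)      # keeps local updates consistent
        w = w - lr * (g_pert + prox - correction)
        # accumulate a correction that offsets the bias of the prox term
        correction += lam * (w - w_global)
    return w
```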

    Visual place recognition for improved open and uncertain navigation

    Visual place recognition localises a query place image by comparing it against a reference database of known place images, a fundamental element of robotic navigation. Recent work focuses on using deep learning to learn image descriptors for this task that are invariant to appearance changes from dynamic lighting, weather and seasonal conditions. However, these descriptors require greater computational resources than are available on robotic hardware; few SLAM frameworks are designed to utilise them; they return a relative comparison between image descriptors that is difficult to interpret; they cannot provide appearance invariance in other navigation tasks such as scene classification; and they are unable to identify query images from an open environment that have no true match in the reference database. This thesis addresses these challenges with three contributions. The first is a lightweight visual place recognition descriptor combined with a probabilistic filter to address a subset of the visual SLAM problem in real time. The second contribution combines visual place recognition and scene classification for appearance-invariant scene classification, which is extended to recognise unknown scene classes when navigating an open environment. The final contribution uses comparisons between query and reference image descriptors to classify whether a localisation is a true or false positive, and whether a true match for the query image exists in the reference database. Funded by the Edinburgh Centre for Robotics and the Engineering and Physical Sciences Research Council (EPSRC).
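    A minimal sketch of the final contribution's decision step, assuming image descriptors have already been computed by some network: cosine similarity against the reference database, with a fixed rejection threshold standing in for the learned true/false-positive classifier described in the thesis.

```python
import numpy as np

def localise(query_desc, ref_descs, reject_threshold=0.8):
    """Return (best reference index, similarity), or (None, similarity)
    when the best match falls below the threshold, i.e. the query is
    judged to have no true match in the reference database. The threshold
    value is an illustrative placeholder."""
    q = query_desc / (np.linalg.norm(query_desc) + 1e-12)
    R = ref_descs / (np.linalg.norm(ref_descs, axis=1, keepdims=True) + 1e-12)
    sims = R @ q                    # cosine similarity to every reference
    best = int(np.argmax(sims))
    if sims[best] < reject_threshold:
        return None, float(sims[best])  # open-set case: reject the match
    return best, float(sims[best])
```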

    Connected World: Insights from 100 academics on how to build better connections


    Book of Abstracts: 9th International Conference on Smart Energy Systems


    Optimising multimodal fusion for biometric identification systems

    Biometric systems are automatic means for imitating the human brain’s ability to identify and verify other humans by their behavioural and physiological characteristics. A system which uses more than one biometric modality at the same time is known as a multimodal system. Multimodal biometric systems consolidate the evidence presented by multiple biometric sources and typically provide better recognition performance compared to systems based on a single biometric modality. This thesis addresses some issues related to the implementation of multimodal biometric identity verification systems. The thesis assesses the feasibility of using commercial off-the-shelf products to construct a deployable multimodal biometric system. It also identifies multimodal biometric fusion as a challenging optimisation problem when one considers the presence of several configurations and settings, in particular the verification thresholds adopted by each biometric device and the decision fusion algorithm implemented for a particular configuration. The thesis proposes a novel approach for the optimisation of multimodal biometric systems based on the use of genetic algorithms to solve some of the problems associated with the different settings. The proposed optimisation method also addresses some of the problems associated with score normalisation. In addition, the thesis presents an analysis of the performance of different fusion rules when characterising the system users as sheep, goats, lambs and wolves. The results presented indicate that the proposed optimisation method can be used to solve the problems associated with threshold settings. This clearly demonstrates a valuable strategy that can be used to set the thresholds of the different biometric devices a priori, before deployment. The proposed optimisation architecture also addresses the problem of score normalisation, enabling an effective “plug-and-play” approach to system implementation. The results further indicate that the optimisation approach can be used to effectively determine the weight settings, which are used in many applications to vary the relative importance of the different performance parameters.
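    To make the optimisation idea concrete, below is a minimal real-coded genetic algorithm of the kind that could tune per-device verification thresholds and fusion weights. The encoding, operators, rates, and the suggested fitness (e.g. 1 - (FAR + FRR)/2 on a tuning set) are illustrative assumptions, not the exact design developed in the thesis.

```python
import numpy as np

def ga_optimise(fitness, dim, pop_size=40, generations=100,
                mut_sigma=0.05, seed=0):
    """Maximise `fitness` over parameter vectors in [0, 1]^dim, e.g.
    device thresholds plus fusion weights scored by 1 - (FAR + FRR)/2
    on a tuning set (hypothetical fitness; supply your own)."""
    rng = np.random.default_rng(seed)
    pop = rng.random((pop_size, dim))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        best = pop[np.argmax(scores)].copy()
        # tournament selection: the fitter of two random individuals wins
        i, j = rng.integers(0, pop_size, (2, pop_size))
        parents = pop[np.where(scores[i] > scores[j], i, j)]
        # uniform crossover between consecutive parents
        mask = rng.random((pop_size, dim)) < 0.5
        pop = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # Gaussian mutation, clipped back to the valid range
        pop = np.clip(pop + rng.normal(0.0, mut_sigma, pop.shape), 0.0, 1.0)
        pop[0] = best                 # elitism: keep the best seen so far
    scores = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(scores)]
```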