
    A Survey on Forensics and Compliance Auditing for Critical Infrastructure Protection

    The growing dependency of modern societies on essential services provided by Critical Infrastructures increases the importance of their trustworthiness. However, Critical Infrastructures are attractive targets for cyberattacks, due to the potential for considerable impact, not just at the economic level but also in terms of physical damage and even loss of human life. Complementing traditional security mechanisms, forensics and compliance auditing play an important role in ensuring Critical Infrastructure trustworthiness. Compliance auditing contributes to checking whether security measures are in place and compliant with standards and internal policies. Forensics assists the investigation of past security incidents. Since these two areas significantly overlap in terms of data sources, tools, and techniques, they can be merged into unified Forensics and Compliance Auditing (FCA) frameworks. In this paper, we survey the latest developments, methodologies, challenges, and solutions addressing forensics and compliance auditing in the scope of Critical Infrastructure Protection. This survey focuses on contributions capable of tackling the requirements imposed by massively distributed and complex Industrial Automation and Control Systems, in terms of handling large volumes of heterogeneous data (which can be noisy, ambiguous, and redundant) for analytic purposes, with adequate performance and reliability. The results produced a taxonomy for the FCA field whose key categories denote the relevant topics in the literature. The collected knowledge also led to a reference FCA architecture, proposed as a generic template for a converged platform. These results are intended to guide future research on forensics and compliance auditing for Critical Infrastructure Protection.

    Survey on detecting and preventing web application broken access control attacks

    Web applications are an essential component of the current wide range of digital services, including financial and governmental services as well as social networking and communications. Broken access control vulnerabilities pose a huge risk to that ecosystem because they allow an attacker to circumvent the allocated permissions and rights and perform actions that they are not authorized to perform. This paper gives a broad survey of current research on approaches used to detect the exploitation of access control vulnerabilities and attacks on web application components. It categorizes these approaches based on their key techniques, compares the different detection methods, and evaluates their strengths and weaknesses. We also identify and elaborate on research gaps found in the current literature. Finally, the paper summarizes the general detection approaches and suggests potential research directions for the future.
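    As a minimal illustration of the vulnerability class this survey covers (not an example drawn from the paper), the sketch below shows an insecure direct object reference in a small Flask app and the object-level authorization check that closes it. The in-memory data, load_invoice() helper, and current_user_id() stub are hypothetical.

```python
# Minimal sketch of a broken-access-control flaw (an insecure direct object
# reference) and the object-level authorization check that closes it.
# The in-memory data, load_invoice() and current_user_id() stubs are hypothetical.
from dataclasses import dataclass

from flask import Flask, abort, jsonify

app = Flask(__name__)


@dataclass
class Invoice:
    id: int
    owner_id: int
    amount: float


INVOICES = {1: Invoice(1, owner_id=42, amount=99.0),
            2: Invoice(2, owner_id=7, amount=10.0)}


def load_invoice(invoice_id):
    return INVOICES.get(invoice_id)


def current_user_id():
    return 42  # stub: in a real app this comes from the session or token


@app.route("/invoices/<int:invoice_id>")
def get_invoice_insecure(invoice_id):
    # VULNERABLE: any authenticated user can read any invoice by guessing its id.
    invoice = load_invoice(invoice_id)
    if invoice is None:
        abort(404)
    return jsonify(vars(invoice))


@app.route("/v2/invoices/<int:invoice_id>")
def get_invoice_secure(invoice_id):
    # FIXED: enforce object-level authorization before returning the resource.
    invoice = load_invoice(invoice_id)
    if invoice is None or invoice.owner_id != current_user_id():
        abort(403)
    return jsonify(vars(invoice))
```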

    Beam scanning by liquid-crystal biasing in a modified SIW structure

    A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for use in 2D scanning arrays with lateral alignment. The 2D array environment requires full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing: the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW) modified to work as a Groove Gap Waveguide, with radiating slots etched on the upper broad wall, so that it radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs. At the same time, the RF field remains laterally confined, making it possible to place several antennas in parallel and achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.

    Computational acquisition of knowledge in small-data environments: a case study in the field of energetics

    The UK’s defence industry is accelerating its implementation of artificial intelligence, including expert systems and natural language processing (NLP) tools designed to supplement human analysis. This thesis examines the limitations of NLP tools in small-data environments, which are common in defence, using the defence-related energetic-materials domain as a case study. A literature review identifies the domain-specific challenges of developing an expert system (specifically an ontology). The absence of domain resources such as labelled datasets and, most significantly, the preprocessing of text resources are identified as challenges. To address the latter, a novel general-purpose preprocessing pipeline, tailored for the energetic-materials domain, is developed and its effectiveness evaluated. Whether NLP tools in data-limited environments can supplement or completely replace human analysis is examined in a study of the subjective concept of importance. A methodology for directly comparing the ability of NLP tools and experts to identify the important points in a text is presented. The results show that the study participants exhibit little agreement, even on which points in the text are important. The NLP tools, the expert (the author of the text being examined), and the participants agree only on general statements; however, as a group, the participants agreed with the expert. In data-limited environments, the extractive-summarisation tools examined cannot identify the important points in a technical document as effectively as an expert. A methodology for classifying journal articles by the technology readiness level (TRL) of the described technologies in a data-limited environment is then proposed. Techniques to overcome challenges with real-world data, such as class imbalance, are investigated, and a methodology to evaluate the reliability of human annotations is presented. The analysis identifies a lack of agreement and consistency in the expert evaluation of document TRL.
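    As an illustration of the kind of domain-aware text preprocessing the thesis discusses, the sketch below lowercases, tokenises, and expands abbreviations in energetic-materials text. The abbreviation table and tokenisation rules are assumptions made for this example; they are not the pipeline developed in the thesis.

```python
# Illustrative preprocessing sketch for specialist energetic-materials text.
# The abbreviation table and tokenisation rules are assumptions for this
# example; they are not the pipeline developed in the thesis.
import re

# Assumed domain abbreviation table (illustrative only).
ABBREVIATIONS = {
    "rdx": "cyclotrimethylenetrinitramine",
    "tnt": "trinitrotoluene",
}


def preprocess(text: str) -> list[str]:
    """Lowercase, normalise whitespace, tokenise, and expand abbreviations."""
    text = re.sub(r"\s+", " ", text.lower()).strip()
    # Keep letters, digits and hyphens together so chemical formulae and
    # hyphenated compound names survive as single tokens.
    tokens = re.findall(r"[a-z0-9\-]+", text)
    return [ABBREVIATIONS.get(tok, tok) for tok in tokens]


print(preprocess("Detonation velocity of RDX exceeds that of TNT."))
```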

    1-D broadside-radiating leaky-wave antenna based on a numerically synthesized impedance surface

    A newly developed deterministic numerical technique for the automated design of metasurface antennas is applied here for the first time to the design of a 1-D printed Leaky-Wave Antenna (LWA) for broadside radiation. The surface-impedance synthesis process does not require any a priori knowledge of the impedance pattern, and starts from a mask constraint on the desired far field and practical bounds on the unit-cell impedance values. The designed reactance surface for broadside radiation exhibits a non-conventional patterning; this highlights the merit of using an automated design process for a problem well known to be challenging for analytical methods. The antenna is physically implemented with an array of metal strips with varying gap widths, and simulation results show very good agreement with the predicted performance.

    Advanced experimental techniques for SiPM characterization at cryogenic temperatures

    The main topic of this thesis is the characterization of silicon photomultiplier (SiPM) detectors for the DUNE experiment, in particular for the ProtoDUNE-HD and DUNE-FD1 modules, and the development of novel experimental setups and techniques to perform this task. The initial overview of DUNE outlines the main features of the experiment (underlying physics, scientific programme, and current design of the individual components), giving particular attention to the first of the four DUNE far detector modules (DUNE-FD1). This includes the general working principles of a LArTPC, the current plan for the DUNE-FD1 design, and the role of SiPMs in this design as the fundamental units of the photon detection system (PDS). A brief description of the ProtoDUNE2-HD detector is also given, since it shares the same PDS design with the DUNE-FD1 detector. A detailed description of the fundamental properties and working principles of SiPMs follows, discussing the most important parameters involved in their characterization in the context of the DUNE experiment. The presented work is part of the SiPM test campaign undertaken by the DUNE PDS consortium to down-select and test SiPMs for both the ProtoDUNE-HD and DUNE-FD1 detectors. The Ferrara group took an active part in both the first test phase, which consisted of the full characterization of single sensors, at room temperature and at liquid-nitrogen temperature, in order to down-select the two most promising SiPM models and define their best operating conditions, and the second phase, a quality-assurance test campaign for a large number of SiPMs of the selected models. Together with the Bologna group, the Ferrara group developed a custom apparatus (CACTUS) to perform fast, automated characterizations and so reduce the time required by this massive SiPM test campaign. CACTUS automatically characterizes up to 120 SiPMs in parallel, measuring their IV characteristics and dark count rate (DCR), both at room temperature and at LN2 temperature, in a single measurement session. The apparatus was used to test the ProtoDUNE2-HD SiPM production (6,000 units), with nearly all sensors falling within the DUNE specifications (failure rate of about 0.05%), and it is currently operating for the DUNE-FD1 SiPM characterization campaign (288,000 units). The system is planned to be replicated at three further sites (Milano Bicocca, Granada, and Prague), which will join Ferrara and Bologna in the DUNE-FD1 SiPM characterization campaign in 2023.
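    To illustrate one of the measurements such a characterization involves, the sketch below estimates a breakdown voltage from a reverse IV curve as the bias at which the logarithmic derivative d(ln I)/dV peaks, a commonly used estimator. The data are synthetic and the code is not the CACTUS analysis software.

```python
# Illustrative estimate of a SiPM breakdown voltage from a reverse I-V curve,
# taken as the bias at which the logarithmic derivative d(ln I)/dV peaks.
# Synthetic data; this is not the CACTUS analysis code.
import numpy as np


def breakdown_voltage(v: np.ndarray, i: np.ndarray) -> float:
    """Return the bias voltage at which d(ln I)/dV is largest."""
    dlogi_dv = np.gradient(np.log(i), v)
    return float(v[np.argmax(dlogi_dv)])


# Toy reverse I-V curve: slow leakage growth below breakdown, steep rise above ~51.5 V.
v = np.linspace(48.0, 56.0, 200)
i = 1e-10 * np.exp(0.1 * (v - 48.0)) + 1e-8 * np.clip(v - 51.5, 0.0, None) ** 2

print(f"estimated breakdown voltage: {breakdown_voltage(v, i):.2f} V")
```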

    Gulf Cooperation Council Countries’ Electricity Sector Forecasting: Consumption Growth Issue and Renewable Energy Penetration Progress Challenges

    The Gulf Cooperation Council (GCC) countries depend on substantial fossil fuel consumption to generate electricity, which has resulted in significant environmental harm. Fossil fuels also represent the principal source of economic income in the region. Climate change is closely associated with the use of fossil fuels and has thus become the main motivation to search for alternative solutions, including solar and wind energy technologies, that eliminate the reliance on fossil fuels and the associated impacts on the climate. This research provides a comprehensive investigation of the consumption growth issue, together with an exploration of the potential of solar and wind energy resources, a review of the renewable energy projects currently implemented in the GCC region, and a critical discussion of their prospects. The projects foreshadow the GCC countries’ ability to comply with future requirements and spearhead the renewable energy transition toward a more sustainable and equitable future. In addition, four forecasting models were developed to analyse the future performance of GCC power sectors, including solar and wind energy resources along with ambient temperatures, based on 40 years of historical data. These were Monte Carlo Simulation (MCS), Brownian Motion (BM), and a seasonal autoregressive integrated moving average with exogenous factors (SARIMAX) model-based time series, and bidirectional long short-term memory (BI-LSTM) and gated recurrent unit (GRU) model-based neural networks. The MCS and BM prediction models apply a regression analysis (which describes the behaviour of an instrument) to a large set of random trials so as to construct a credible set of probable future outcomes. The MCS and BM prediction models have proven to be an exceptional investigative solution for long-term prediction with different types of historical data, including (i) four types of fossil fuel data; (ii) three types of solar irradiance data; (iii) wind speed data; and (iv) temperature data. In addition, the prediction model is able to cope with large volumes of historical data at different intervals, including yearly, quarterly, and daily. The simplicity of implementation is a strength of the MCS and BM techniques. The SARIMAX technique applies a time-series approach with seasonal and exogenous influencing factors, which helps to reduce the error values and improve the overall model accuracy, even when the input and output dataset lengths are close. This research proposes a forecasting framework that applies the SARIMAX model to forecast the long-term performance of the electricity sector (including electricity consumption, generation, peak load, and installed capacity). The SARIMAX model was used to forecast the aforementioned factors for the GCC region over a 30-year horizon, from 2021 to 2050. The experimental findings indicate that the SARIMAX model performs well, significantly improving forecasting accuracy compared with simpler autoregressive integrated moving average techniques. The BI-LSTM model has the advantage of processing information in two opposing directions and providing feedback to the same outputs via two different hidden layers; a BI-LSTM’s output layer concurrently receives information from both the backward and forward layers.
The BI-LSTM prediction model was designed to predict solar irradiance, including global horizontal irradiance (GHI), direct normal irradiance (DNI), and diffuse horizontal irradiance (DHI), for the next 169 hours. The findings demonstrate that the BI-LSTM model has an encouraging performance, with considerable accuracy for all three types of solar irradiance data from the six GCC countries. The model can handle different sizes of sequential data and generates low error metrics. The GRU prediction model automatically learned the features, used fewer training parameters, and required a shorter time to train compared to other types of RNNs. The GRU model was designed to forecast wind speeds and temperature values 169 hours ahead, based on 36 years of hourly historical data (1 January 1985 to 26 June 2021) collected from the GCC region. The findings indicate that the GRU model offers a promising performance, with good prediction accuracy, reliability, resolution, efficiency, and generalisability, while avoiding overfitting. The GRU model is characterised by its superior performance and strong evaluation error metrics for wind speed and temperature fluctuations. Finally, the models aim to address the lack of future planning and accurate analysis of the energy sector's forecasted performance and intermittency, providing a reliable forecasting technique, which is a prerequisite for modern energy systems.
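    As a concrete illustration of the SARIMAX approach described above, the sketch below fits an ARIMA model with one exogenous regressor using the statsmodels SARIMAX class and produces a multi-step forecast. The synthetic data, the temperature regressor, the ARIMA(1,1,1) order, and the 30-step horizon are illustrative assumptions, not the models fitted in the thesis.

```python
# Minimal SARIMAX forecasting sketch with one exogenous regressor, using
# statsmodels.  The synthetic data, ARIMA(1,1,1) order and 30-step horizon
# are illustrative assumptions, not the models fitted in the thesis.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
n_hist = 40  # 40 annual observations, matching the 40-year history in the thesis

# Toy series: ambient temperature (exogenous) and electricity demand (target).
temperature = 27.0 + 0.03 * np.arange(n_hist) + rng.normal(0.0, 0.3, n_hist)
demand = 50.0 + 2.0 * np.arange(n_hist) + 5.0 * temperature + rng.normal(0.0, 3.0, n_hist)

# Fit an ARIMA(1,1,1) model with temperature as the exogenous factor.
model = SARIMAX(demand, exog=temperature.reshape(-1, 1), order=(1, 1, 1))
result = model.fit(disp=False)

# Forecast 30 steps ahead, supplying an assumed future temperature path.
future_temperature = temperature[-1] + 0.03 * np.arange(1, 31)
forecast = result.forecast(steps=30, exog=future_temperature.reshape(-1, 1))
print(np.round(forecast[:5], 1))
```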

    The value of community pharmacy incident reporting in optimising the safety and quality use of medicines

    Medication safety has emerged as a healthcare priority with the launch of the World Health Organization’s third global patient safety challenge. Understanding the complex interplay between human and system factors that potentiate medication incidents can illuminate opportunities to improve organisational safeguards and safe medication practices. This thesis aimed to develop, implement, and evaluate a systematic incident reporting system (IRS) to identify, characterise, and address risks to medication safety and the quality use of medicines (QUM) in primary care. The study was conducted in 30 community pharmacies in Sydney, Australia, through a confidential and anonymous IRS called QUMwatch. The study used the Advanced Incident Management System (AIMS) taxonomy, a hierarchical classification system based on error theory. Analysis of 1,013 incident reports collected over 30 months identified that medication incidents (MIs) most often involved patients over 65 years old, the prescribing stage, and medicines acting on the cardiovascular and nervous systems. Human, task, and organisational factors contributed to MIs, particularly healthcare providers' cognitive errors, communication problems, poor risk management, and safety culture. Factors that facilitated error recovery included individual attributes, appropriate intervention, effective communication, and the use of standardised protocols. Remedial actions included changes in care plans, dosage changes, reviews of medicines, and cessation of medicines. The study evaluated the QUMwatch program's tools and methods using a mixed-methods approach and found that 16 out of 20 variables on the data collection form had over 90% complete data and that data consistency was high. Anonymity was the preferred method of reporting. The stimulatory package significantly raised the reporting rate from a baseline average of 32.4 to 77.3 reports per month (p < .001). The AIMS taxonomy for MIs had substantial validity for high-order medication processes in the Australian community pharmacy context. The study demonstrated the feasibility of a well-designed IRS in community pharmacy for identifying MIs and generating safety lessons and recommendations.

    Strangeness production analysis in simulated proton-proton collisions at the LHC with the new ALICE Run 3 computing environment

    During the second long shutdown of the Large Hadron Collider (end of 2018 to mid-2022), the ALICE experiment underwent a complete reworking of its detector, reconstruction, and analysis software, to prepare for the analysis of collisions occurring at an increased interaction rate and recorded continuously, without trigger. The recording in October 2021 of the pilot-beam proton-proton collisions at √s = 900 GeV and the production of Monte Carlo data simulating those beam conditions allowed an extensive test of the new detector as well as of the reconstruction and analysis tools. In this context, this work focuses on the analysis of the V0 decays K0S and Lambda in a Monte Carlo simulation of the pilot-beam collisions. These decays are building blocks for the analysis of strange decays in general, a valuable probe of the quark-gluon plasma. Their transverse-momentum spectra are computed from the data reconstructed with the new analysis software and compared to the spectra generated by the simulation. The analysis suite succeeds in reproducing the generated spectrum. The lower reconstruction efficiency compared to analyses done before the upgrade prompted a loosening of the reconstruction cuts, despite the additional computing load this implies. An imperfection in the calibration of the first preliminary results from real data is also observed.
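    The comparison described here, reconstructed versus generated transverse-momentum spectra and the resulting reconstruction efficiency, can be illustrated with a simple histogram ratio. The toy spectra, the assumed efficiency shape, and the binning below are illustrative assumptions; this is not the ALICE analysis code.

```python
# Toy illustration of the efficiency comparison described above: the ratio of
# the reconstructed to the generated transverse-momentum (pT) spectrum.
# Synthetic spectra and binning; this is not the ALICE analysis code.
import numpy as np

rng = np.random.default_rng(1)

# Toy generated K0S pT spectrum (GeV/c) and an assumed pT-dependent
# reconstruction probability that saturates at high pT.
pt_generated = rng.exponential(scale=0.8, size=100_000)
reco_probability = 0.9 * (1.0 - np.exp(-pt_generated / 0.5))
pt_reconstructed = pt_generated[rng.random(pt_generated.size) < reco_probability]

bins = np.linspace(0.0, 5.0, 26)  # 25 bins, 200 MeV/c wide
n_gen, _ = np.histogram(pt_generated, bins=bins)
n_rec, _ = np.histogram(pt_reconstructed, bins=bins)

# Bin-by-bin efficiency, guarding against empty generated bins.
efficiency = np.divide(n_rec, n_gen, out=np.zeros(len(n_gen)), where=n_gen > 0)

centres = 0.5 * (bins[:-1] + bins[1:])
for c, e in zip(centres[:5], efficiency[:5]):
    print(f"pT = {c:.1f} GeV/c   efficiency = {e:.2f}")
```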