
    Ensemble Kalman Filter Assimilation of ERT Data for Numerical Modeling of Seawater Intrusion in a Laboratory Experiment

    Seawater intrusion in coastal aquifers is a worldwide problem exacerbated by aquifer overexploitation and climate change. To limit the deterioration of water quality caused by saline intrusion, research is needed to identify and assess the performance of possible countermeasures, e.g., underground barriers. Within this context, numerical models are fundamental for fully understanding the process and for evaluating the effectiveness of the proposed solutions to contain the saltwater wedge; on the other hand, they are typically affected by uncertainty in hydrogeological parameters, as well as in initial and boundary conditions. Data assimilation methods such as the ensemble Kalman filter (EnKF) are promising tools that can reduce such uncertainties. Here, we present an application of the EnKF to the numerical modeling of a laboratory experiment in which seawater intrusion was reproduced in a specifically designed sandbox and continuously monitored with electrical resistivity tomography (ERT). Combining the EnKF with the SUTRA model for the simulation of density-dependent flow and transport in porous media, we assimilated the collected ERT data by means of joint and sequential assimilation approaches. In the joint approach, raw ERT data (electrical resistances) are assimilated to update both salt concentration and soil parameters, without the need for an electrical inversion. In the sequential approach, we assimilated electrical conductivities computed from a previously performed electrical inversion. Within both approaches, we suggest dual-step update strategies to minimize the effects of spurious correlations in parameter estimation. The results show that, in both cases, ERT data assimilation reduces the uncertainty not only in the system state in terms of salt concentration, but also in the most relevant soil parameters, i.e., saturated hydraulic conductivity and longitudinal dispersivity. However, the sequential approach is more prone to filter inbreeding because of the large number of observations assimilated relative to the ensemble size.
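
    A minimal sketch of one stochastic EnKF analysis step for an augmented state vector that stacks salt concentrations with a soil parameter (as in the joint approach described above). Variable names, dimensions, and the toy observation operator are illustrative assumptions, not the authors' SUTRA/ERT implementation.

```python
import numpy as np

def enkf_update(ensemble, observations, obs_operator, obs_error_std, rng):
    """One EnKF analysis step with perturbed observations.

    ensemble     : (n_state, n_ens) forecast ensemble (concentrations + parameters)
    observations : (n_obs,) measured data (e.g. electrical resistances)
    obs_operator : function mapping a state vector to predicted observations
    obs_error_std: standard deviation of the observation noise
    """
    n_state, n_ens = ensemble.shape
    # Predicted observations for each ensemble member
    predicted = np.column_stack([obs_operator(ensemble[:, j]) for j in range(n_ens)])
    n_obs = predicted.shape[0]

    # Ensemble anomalies (deviations from the ensemble mean)
    A = ensemble - ensemble.mean(axis=1, keepdims=True)
    Y = predicted - predicted.mean(axis=1, keepdims=True)

    # Sample covariances and Kalman gain
    P_xy = A @ Y.T / (n_ens - 1)                       # state-observation covariance
    P_yy = Y @ Y.T / (n_ens - 1) + obs_error_std**2 * np.eye(n_obs)
    K = P_xy @ np.linalg.solve(P_yy, np.eye(n_obs))    # Kalman gain

    # Perturb observations so the analysis ensemble keeps a consistent spread
    perturbed_obs = observations[:, None] + rng.normal(0.0, obs_error_std, (n_obs, n_ens))
    return ensemble + K @ (perturbed_obs - predicted)

# Toy usage: 50-member ensemble of a 10-cell concentration field plus one parameter
rng = np.random.default_rng(0)
ens = rng.normal(size=(11, 50))
H = lambda x: x[:3]                       # hypothetical operator observing the first three cells
obs = np.array([0.2, 0.1, 0.05])
analysis = enkf_update(ens, obs, H, obs_error_std=0.02, rng=rng)
```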

    Optimal Information-Theoretic Wireless Location Verification

    We develop a new Location Verification System (LVS) focused on network-based Intelligent Transport Systems and vehicular ad hoc networks. The algorithm we develop is based on an information-theoretic framework that uses the received signal strength (RSS) from a network of base stations and the claimed position. Based on this information, we derive the optimal decision regarding the verification of the user's location. Our algorithm is optimal in the sense of maximizing the mutual information between its input and output data. Our approach is based on the practical scenario in which a non-colluding malicious user some distance from a highway optimally boosts his transmit power in an attempt to fool the LVS into believing he is on the highway. We develop a practical threat model for this attack scenario and investigate in detail the performance of the LVS in terms of its input/output mutual information. We show how our LVS decision rule can be implemented straightforwardly, with performance that is near-optimal under realistic threat conditions and that approaches information-theoretic optimality as the malicious user moves further from the highway. The practical advantages our new information-theoretic scheme delivers relative to more traditional Bayesian verification frameworks are discussed.
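
    A minimal sketch of choosing a decision threshold that maximizes the mutual information between the user's true state (legitimate or malicious) and the binary verification decision, in the spirit of the information-theoretic criterion described above. The prior, the false-alarm and detection curves, and the threshold sweep are illustrative assumptions, not the paper's threat model.

```python
import numpy as np

def mutual_information(p_malicious, p_false_alarm, p_detection):
    """I(X;Y) in bits for a binary-input, binary-output verification channel."""
    p0, p1 = 1.0 - p_malicious, p_malicious
    # Joint distribution over (true state X, decision Y = "flag as malicious")
    joint = np.array([
        [p0 * (1 - p_false_alarm), p0 * p_false_alarm],  # X = legitimate user
        [p1 * (1 - p_detection),   p1 * p_detection],     # X = malicious user
    ])
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (px * py))
    return np.nansum(terms)          # zero-probability cells contribute nothing

# Hypothetical ROC-style curves obtained by sweeping a threshold on an RSS statistic
thresholds = np.linspace(0.0, 5.0, 201)
p_fa = np.exp(-thresholds)            # false-alarm probability falls quickly with threshold
p_d = np.exp(-thresholds / 3.0)       # detection probability falls more slowly

mi = [mutual_information(0.1, fa, d) for fa, d in zip(p_fa, p_d)]
best = thresholds[int(np.argmax(mi))]
print(f"threshold maximizing mutual information: {best:.2f}")
```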

    Intrusion detection by machine learning = Behatolás detektálás gépi tanulás által

    Since the early days of information technology, there have been many stakeholders who used technological capabilities for their own benefit, whether through legal operations or through illegal access to computational assets and sensitive information. Every year, businesses invest large amounts of effort in upgrading their IT infrastructure, yet even today they are unprepared to protect their most valuable assets: data and knowledge. This lack of protection was the main motivation for this dissertation. In this study, intrusion detection, a field of information security, is evaluated through the use of several machine learning models performing signature and hybrid detection. This is a challenging field, mainly because of the high velocity and imbalanced nature of network traffic. To construct machine learning models capable of intrusion detection, the applied methodologies were the CRISP-DM process model, designed to help data scientists plan, create, and integrate machine learning models into a business information infrastructure, and design science research, which answers research questions with information technology artefacts. The two methodologies have much in common, which is elaborated further in the study. The goals of this dissertation were twofold: first, to create an intrusion detector providing a high level of detection performance, measured using accuracy and recall; and second, to identify techniques that can increase intrusion detection performance. Of the designed models, a hybrid autoencoder + stacking neural network model achieved detection performance comparable to the best models reported in the related literature, with good detection of minority classes. To achieve this result, the techniques identified were synthetic sampling, advanced hyperparameter optimization, model ensembles, and autoencoder networks. In addition, the dissertation establishes a soft hierarchy among the different detection techniques in terms of performance and provides a brief outlook on potential future practical applications of network intrusion detection models.
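
    A minimal sketch of a hybrid "autoencoder + stacking" detector in the spirit of the model described above: an autoencoder trained on benign traffic supplies a reconstruction-error feature, which is appended to the inputs of a stacked classifier. The synthetic data and the scikit-learn estimators are stand-ins for the dissertation's datasets and network architectures, not its actual implementation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Synthetic, imbalanced "traffic" data: class 1 = attack, class 0 = benign
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

scaler = StandardScaler().fit(X_train)
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

# "Autoencoder": an MLP trained to reconstruct benign samples only
autoencoder = MLPRegressor(hidden_layer_sizes=(8,), max_iter=500, random_state=0)
autoencoder.fit(X_train_s[y_train == 0], X_train_s[y_train == 0])

def with_reconstruction_error(X_scaled):
    # Append the per-sample reconstruction error as an extra feature
    err = np.mean((autoencoder.predict(X_scaled) - X_scaled) ** 2, axis=1, keepdims=True)
    return np.hstack([X_scaled, err])

# Stacking ensemble trained on the augmented feature set
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(with_reconstruction_error(X_train_s), y_train)
print("test accuracy:", stack.score(with_reconstruction_error(X_test_s), y_test))
```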

    Performance Evaluation of Network Anomaly Detection Systems

    Nowadays, there is a huge and growing concern about security in information and communication technology (ICT) among the scientific community, because any attack or anomaly in the network can greatly affect many domains, such as national security, private data storage, social welfare, and economic issues. Anomaly detection is therefore a broad research area, and many different techniques and approaches for this purpose have emerged over the years. Attacks, problems, and internal failures, when not detected early, may badly harm an entire network system. Thus, this thesis presents an autonomous profile-based anomaly detection system based on the statistical method Principal Component Analysis (PCADS-AD). This approach creates a network profile called the Digital Signature of Network Segment using Flow Analysis (DSNSF), which denotes the predicted normal behavior of network traffic activity obtained through historical data analysis. That digital signature is used as a threshold for volume anomaly detection, flagging disparities from the normal traffic trend. The proposed system uses seven traffic flow attributes: bits, packets, and number of flows to detect problems, and source and destination IP addresses and ports to provide the network administrator with the information necessary to solve them. Through evaluation techniques, the addition of a different anomaly detection approach, and comparisons with other methods, all performed in this thesis using real network traffic data, the results showed good traffic prediction by the DSNSF and encouraging results in terms of false alarm generation and detection accuracy. The observed results seek to contribute to the advance of the state of the art in methods and strategies for anomaly detection, aiming to overcome some of the challenges that emerge from the constant growth in complexity, speed, and size of today's large-scale networks, while also providing high-value results for better detection in real time. Moreover, the low complexity and agility of the proposed system allow it to be applied to real-time detection.
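
    A minimal sketch of a PCA-based traffic profile used as a threshold for volume anomaly detection, loosely following the DSNSF idea described above: historical days of one flow attribute (e.g. bits per 5-minute bin) are reduced to their principal components to form an expected daily signature, and new days are compared against it. The synthetic data, bin size, number of components, and threshold rule are assumptions, not the thesis's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, n_bins = 30, 288                        # 30 historical days, 5-minute bins
daily_shape = 1.0 + np.sin(np.linspace(0, 2 * np.pi, n_bins))
history = daily_shape + 0.1 * rng.normal(size=(n_days, n_bins))   # bits per bin (scaled)

# Digital signature: reconstruct daily behaviour from the top principal components
centered = history - history.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
k = 3                                            # number of components kept
reconstructed = history.mean(axis=0) + centered @ vt[:k].T @ vt[:k]
dsnsf = reconstructed.mean(axis=0)               # expected traffic per bin

# Threshold: allow a margin of a few historical standard deviations per bin
margin = 3.0 * history.std(axis=0)

# A new day with an injected volume anomaly between bins 100 and 120
today = daily_shape + 0.1 * rng.normal(size=n_bins)
today[100:120] += 2.0
anomalous_bins = np.where(np.abs(today - dsnsf) > margin)[0]
print("anomalous bins:", anomalous_bins)
```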