10 research outputs found

    DYST (Did You See That?): An Amplified Covert Channel That Points To Previously Seen Data

    Covert channels are unforeseen and stealthy communication channels that enable manifold adversary scenarios; however, they can also allow the exchange of confidential information, for example by journalists. All covert channels described until now need to craft seemingly legitimate information flows for their information exchange, mimicking unsuspicious behavior. In this paper, we present DYST, which represents a new class of covert channels we call history covert channels, together with the new paradigm of covert channel amplification. History covert channels can communicate almost exclusively by pointing to unaltered legitimate traffic created by regular network nodes. Only a negligible fraction of the covert communication process requires the transfer of actual covert channel information by the covert channel's sender. This allows, for the first time, an amplification of the covert channel's message size, i.e., minimizing the fraction of secret data actually transferred by a covert channel's sender in relation to the overall secret data being exchanged. We extend the current taxonomy for covert channels to show how history channels can be categorized. We describe multiple scenarios in which history covert channels can be realized, theoretically analyze the characteristics of these channels, and show how their configuration can be optimized for different implementations. We further evaluate the robustness and detectability of history covert channels.
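
    A minimal Python sketch may help make the history-channel and amplification ideas concrete. It assumes, hypothetically, that sender and receiver observe the same legitimate traffic and share a hash function; all names and the chunk size are invented for illustration, and this is not DYST's actual signaling protocol. The sender conveys CHUNK_BITS secret bits per match while transmitting only a tiny pointer, which is where the amplification comes from:

        import hashlib
        import os

        CHUNK_BITS = 4  # secret bits conveyed per matched packet (illustrative)

        def chunk_bits(packet):
            """Map a packet to CHUNK_BITS pseudo-random bits via its hash."""
            return format(hashlib.sha256(packet).digest()[0], "08b")[:CHUNK_BITS]

        def sender(secret, traffic):
            """Yield the index of each legitimate packet whose hash happens to
            encode the next secret chunk; these tiny pointers are the only
            covert data actually transmitted."""
            pos = 0
            for i, pkt in enumerate(traffic):
                if pos >= len(secret):
                    break
                if chunk_bits(pkt) == secret[pos:pos + CHUNK_BITS]:
                    yield i
                    pos += CHUNK_BITS

        def receiver(pointers, traffic):
            """Recover the secret from the pointed-to, unaltered packets."""
            return "".join(chunk_bits(traffic[i]) for i in pointers)

        traffic = [os.urandom(16) for _ in range(5000)]  # stand-in legitimate traffic
        secret = "1011001110001101"                      # 16 secret bits
        pointers = list(sender(secret, traffic))
        assert receiver(pointers, traffic) == secret     # holds w.h.p. given enough traffic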

    Uncertainty-aware spot rejection rate as quality metric for proton therapy using a digital tracking calorimeter

    Objective. Proton therapy is highly sensitive to range uncertainties due to the nature of the dose deposition of charged particles. To ensure treatment quality, range verification methods can be used to verify that the individual spots in a pencil beam scanning treatment fraction match the treatment plan. This study introduces a novel metric for proton therapy quality control based on uncertainties in range verification of individual spots. Approach. We employ uncertainty-aware deep neural networks to predict the Bragg peak depth in an anthropomorphic phantom based on secondary charged particle detection in a silicon pixel telescope designed for proton computed tomography. The subsequently predicted Bragg peak positions, along with their uncertainties, are compared to the treatment plan, rejecting spots which are predicted to be outside the 95% confidence interval. The spot rejection rate produced in this way serves as a metric for the quality of the treatment fraction. Main results. The introduced spot rejection rate metric is shown to be well-defined for range predictors with well-calibrated uncertainties. Using this method, treatment errors in the form of lateral shifts can be detected down to 1 mm after around 1400 treated spots with spot intensities of 1 × 10⁷ protons. The range verification model used in this metric predicts the Bragg peak depth to a mean absolute error of 1.107 ± 0.015 mm. Significance. Uncertainty-aware machine learning has potential applications in proton therapy quality control. This work presents the foundation for future developments in this area.
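
    The rejection criterion described above lends itself to a compact sketch. Assuming Gaussian predictive uncertainty, so that the 95% confidence interval is the prediction ± 1.96σ (an assumption of this sketch; function and variable names are illustrative, not the study's code), the spot rejection rate is the fraction of spots whose planned depth falls outside that interval:

        import numpy as np

        def spot_rejection_rate(pred_depth, pred_sigma, planned_depth, z=1.96):
            """Fraction of spots whose planned Bragg peak depth lies outside the
            95% confidence interval of the (assumed Gaussian) depth prediction."""
            pred_depth = np.asarray(pred_depth)
            pred_sigma = np.asarray(pred_sigma)
            planned_depth = np.asarray(planned_depth)
            return (np.abs(planned_depth - pred_depth) > z * pred_sigma).mean()

        # For a well-calibrated predictor and an error-free delivery, roughly 5%
        # of spots are rejected by construction; treatment errors push the rate up.
        rng = np.random.default_rng(0)
        planned = rng.uniform(50.0, 150.0, size=1400)   # planned depths in mm
        sigma = np.full(1400, 2.0)                      # predictive std in mm
        predicted = planned + rng.normal(0.0, sigma)    # calibrated predictions
        print(spot_rejection_rate(predicted, sigma, planned))  # ~0.05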

    Exploration of differentiability in a proton computed tomography simulation framework

    Objective. Gradient-based optimization using algorithmic derivatives can be a useful technique to improve engineering designs with respect to a computer-implemented objective function. Likewise, uncertainty quantification through computer simulations can be carried out by means of derivatives of the computer simulation. However, the effectiveness of these techniques depends on how ‘well-linearizable’ the software is. In this study, we assess how promising derivative information of a typical proton computed tomography (pCT) scan computer simulation is for the aforementioned applications. Approach. This study is mainly based on numerical experiments, in which we repeatedly evaluate three representative computational steps with perturbed input values. We support our observations with a review of the algorithmic steps and arithmetic operations performed by the software, using debugging techniques. Main results. The model-based iterative reconstruction (MBIR) subprocedure (at the end of the software pipeline) and the Monte Carlo (MC) simulation (at the beginning) were piecewise differentiable. However, the observed high density and magnitude of jumps were likely to preclude most meaningful uses of the derivatives. Jumps in the MBIR function arose from the discrete computation of the set of voxels intersected by a proton path, and could be reduced in magnitude by a ‘fuzzy voxels’ approach. The investigated jumps in the MC function arose from local changes in the control flow that affected the number of random numbers consumed. The tracking algorithm solves an inherently non-differentiable problem. Significance. Besides the technical challenges of merely applying algorithmic differentiation (AD) to existing software projects, the MC and MBIR codes must be adapted to compute smoother functions. For the MBIR code, we presented one possible approach for this, while for the MC code, this will be the subject of further research. For the tracking subprocedure, further research on surrogate models is necessary.
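
    The ‘fuzzy voxels’ remedy mentioned above can be illustrated in one dimension: replace the binary ‘path sample lies inside the voxel’ indicator with a weight that ramps smoothly across the voxel boundary, so a small movement of the path no longer causes a jump in the intersection weights. This is a sketch of the general principle only; the paper's actual modification of the MBIR code may differ:

        import numpy as np

        def hard_voxel_weights(path_x, centers, half_width):
            """Binary membership: 1 if the 1-D path sample falls inside a voxel."""
            return (np.abs(path_x - centers) <= half_width).astype(float)

        def fuzzy_voxel_weights(path_x, centers, half_width, softness=0.1):
            """Smooth surrogate: a sigmoid ramp of width `softness` replaces the
            sharp voxel boundary, so the weights vary continuously with path_x."""
            d = np.abs(path_x - centers)
            return 1.0 / (1.0 + np.exp((d - half_width) / softness))

        centers = np.arange(0.5, 10.5)        # voxel centers on a unit grid
        for x in (3.99, 4.00, 4.01):          # nudge the path across a boundary
            hard = hard_voxel_weights(x, centers, 0.5)[3]
            fuzzy = fuzzy_voxel_weights(x, centers, 0.5)[3]
            print(x, hard, round(fuzzy, 3))   # the hard weight jumps, the fuzzy one does not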

    WoDiCoF - A Testbed for the Evaluation of (Parallel) Covert Channel Detection Algorithms

    With the increasing number of steganography-capable malware and the increasing trend of stealthy data exfiltrations, network covert channels are becoming a crucial security threat - also for critical infrastructures (CIs): network covert channels enable the stealthy remote control of malware nested in a CI and allow the exfiltration of sensitive data, such as sensor values, firmware, or configuration parameters. We present WoDiCoF, a distributed testbed, accessible to the international research community, for performing a unified evaluation of detection algorithms for network covert channels. In comparison to existing works, our testbed is designed for upcoming big-data scenarios, in which huge traffic recordings must be analyzed for covert channels. It is the first testbed to allow the testing of parallel detection algorithms. To evaluate WoDiCoF, we took a detection algorithm published in ACM CCS/TISSEC, verified several of the original results, and enhanced the understanding of its performance by considering previously unconsidered parameters. By parallelizing the algorithm, we moreover achieved a speed-up of 2.89 with three nodes.
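
    For reference, a speed-up of 2.89 on three nodes corresponds to a parallel efficiency of about 96%. The chunk-parallel pattern behind such an evaluation can be sketched as follows; the per-chunk statistic used here (regularity of inter-arrival times) is a deliberately simple stand-in chosen for this sketch, not the ACM CCS/TISSEC algorithm evaluated in the paper:

        from concurrent.futures import ProcessPoolExecutor
        import numpy as np

        def detect_chunk(inter_arrivals):
            """Hypothetical per-chunk test: covert timing channels tend to make
            inter-arrival times unusually regular, so flag chunks whose
            coefficient of variation falls below an (illustrative) threshold."""
            cv = np.std(inter_arrivals) / np.mean(inter_arrivals)
            return cv < 0.2

        def parallel_detect(inter_arrivals, n_chunks=12, workers=3):
            """Split a traffic recording into chunks and test them in parallel."""
            chunks = np.array_split(np.asarray(inter_arrivals), n_chunks)
            with ProcessPoolExecutor(max_workers=workers) as pool:
                flags = list(pool.map(detect_chunk, chunks))
            return sum(flags) / n_chunks      # fraction of suspicious chunks

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            legit = rng.exponential(10.0, size=120_000)  # bursty legitimate traffic
            print(parallel_detect(legit))                # ~0.0 for legitimate traffic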

    Investigating particle track topology for range telescopes in particle radiography using convolutional neural networks

    Background: Proton computed tomography (pCT) and radiography (pRad) are proposed modalities for improved treatment plan accuracy and in situ treatment validation in proton therapy. The pCT system of the Bergen pCT collaboration is able to handle very high particle intensities by means of track reconstruction. However, incorrectly reconstructed and secondary tracks degrade the image quality. We have investigated whether a convolutional neural network (CNN)-based filter is able to improve the image quality. Material and methods: The CNN was trained by simulation and reconstruction of tens of millions of proton and helium tracks. The CNN filter was then compared to simple energy-loss threshold methods using the area under the receiver operating characteristic curve (AUROC), and by comparing the image quality and water equivalent path length (WEPL) error of proton and helium radiographs filtered with the same methods. Results: The CNN method led to a considerable improvement of the AUROC, from 74.3% to 97.5% with protons and from 94.2% to 99.5% with helium. The CNN filtering reduced the WEPL error in the helium radiograph from 1.03 mm to 0.93 mm, while no improvement was seen in the CNN-filtered pRads. Conclusion: The CNN improved the filtering of proton and helium tracks. Only in the helium radiograph did this lead to improved image quality.
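
    The comparison reported above can be mimicked on synthetic data: score each track's per-layer energy-deposition profile with (a) a simple total-energy-loss threshold and (b) a small 1-D CNN, then compare AUROC values. In the sketch below (PyTorch and scikit-learn), the profile shapes, network size, and all hyperparameters are invented for illustration and do not reflect the Bergen pCT pipeline; the point is only that a CNN can exploit track topology that a scalar threshold ignores:

        import numpy as np
        import torch
        import torch.nn as nn
        from sklearn.metrics import roc_auc_score

        # Toy data: energy-loss profiles over 40 layers. "Good" tracks rise toward
        # a Bragg-peak-like maximum; "bad" tracks are flat with a similar total.
        rng = np.random.default_rng(0)
        n, layers = 2000, 40
        labels = rng.integers(0, 2, n)
        depth = np.linspace(0.0, 1.0, layers)
        profiles = np.where(labels[:, None] == 1,
                            1.0 + 4.0 * depth**3, np.full(layers, 1.8))
        profiles = profiles + rng.normal(0.0, 1.0, (n, layers))  # detector noise

        # Baseline: threshold on total energy loss (score = profile sum).
        auroc_threshold = roc_auc_score(labels, profiles.sum(axis=1))

        # CNN: 1-D convolutions over the profile, trained with logistic loss.
        model = nn.Sequential(
            nn.Conv1d(1, 8, 5, padding=2), nn.ReLU(),
            nn.Conv1d(8, 8, 5, padding=2), nn.ReLU(),
            nn.Flatten(), nn.Linear(8 * layers, 1),
        )
        x = torch.tensor(profiles, dtype=torch.float32).unsqueeze(1)
        y = torch.tensor(labels, dtype=torch.float32).unsqueeze(1)
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        for _ in range(200):
            opt.zero_grad()
            nn.functional.binary_cross_entropy_with_logits(model(x), y).backward()
            opt.step()

        auroc_cnn = roc_auc_score(labels, model(x).detach().numpy().ravel())
        print(f"threshold AUROC={auroc_threshold:.3f}  CNN AUROC={auroc_cnn:.3f}")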

    Exploration of Differentiability in a Proton Computed Tomography Simulation Framework

    Objective. Algorithmic differentiation (AD) can be a useful technique for numerically optimizing design and algorithmic parameters of, and quantifying uncertainties in, computer simulations. However, the effectiveness of AD depends on how 'well-linearizable' the software is. In this study, we assess how promising derivative information of a typical proton computed tomography (pCT) scan computer simulation is for the aforementioned applications. Approach. This study is mainly based on numerical experiments, in which we repeatedly evaluate three representative computational steps with perturbed input values. We support our observations with a review of the algorithmic steps and arithmetic operations performed by the software, using debugging techniques. Main results. The model-based iterative reconstruction (MBIR) subprocedure (at the end of the software pipeline) and the Monte Carlo (MC) simulation (at the beginning) were piecewise differentiable. Jumps in the MBIR function arose from the discrete computation of the set of voxels intersected by a proton path. Jumps in the MC function likely arose from changes in the control flow that affect the number of random numbers consumed. The tracking algorithm solves an inherently non-differentiable problem. Significance. The MC and MBIR codes are ready for the integration of AD, and further research on surrogate models for the tracking subprocedure is necessary.
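
    The reported source of jumps in the MC step, namely control flow that changes how many random numbers are consumed, can be reproduced in a toy model (purely illustrative, not the actual pCT simulation code): once a branch consumes an extra draw, every subsequent draw in the stream shifts, so the output jumps under an arbitrarily small input perturbation even though each branch is smooth.

        import random

        def toy_mc(threshold, n_steps=100_000, seed=42):
            """Toy Monte Carlo: a branch consumes an extra random number whenever
            a draw falls below `threshold`, so the input controls how many
            randoms all later steps see. Purely illustrative."""
            rng = random.Random(seed)
            total = 0.0
            for _ in range(n_steps):
                u = rng.random()
                if u < threshold:             # control flow depends on the input
                    total += rng.random()     # extra draw shifts the whole stream
                else:
                    total += u
            return total / n_steps

        # A perturbation of 1e-4 flips the branch for a handful of draws; every
        # later draw is then shifted, so the output jumps by far more than the
        # perturbation itself, instead of varying smoothly.
        for t in (0.3000, 0.3001, 0.3002):
            print(f"{t:.4f}  {toy_mc(t):.6f}")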

    X. Bibliography (Literaturverzeichnis)
