
    Development of an electrical model of a resistive micromegas

    We have developed a model to simulate the behavior of a resistive Micromegas (MICRO-MEsh GAseous Structure) detector in response to a discharge, using electronic design software (Virtuoso).

    Practical Schemes For Privacy & Security Enhanced RFID

    Proper privacy protection in RFID systems is important. However, many of the known schemes are impractical, either because they use hash functions instead of the more hardware-efficient symmetric encryption schemes as the cryptographic primitive, or because they incur a rather costly key-search time penalty at the reader. Moreover, they do not allow for the dynamic, fine-grained access control to the tag that caters for more complex usage scenarios. In this paper we investigate such scenarios, and propose a model and corresponding privacy-friendly protocols for efficient and fine-grained management of access permissions to tags. In particular, we propose an efficient mutual authentication protocol between a tag and a reader that achieves a reasonable level of privacy, using only symmetric-key cryptography on the tag, while not requiring a costly key-search algorithm at the reader side. Moreover, our protocol is able to recover from stolen readers. Comment: 18 pages
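    The abstract names the ingredients of the protocol (symmetric-key cryptography on the tag, no exhaustive key search at the reader) but not its messages. The sketch below only illustrates that combination, assuming a pseudonym that lets the reader look the tag key up in constant time; all names are hypothetical, and an HMAC-based PRF stands in for the hardware block cipher a real tag would use.

```python
# Minimal sketch of symmetric-key mutual authentication with O(1) key lookup at the reader.
# Illustrative only; not the paper's protocol.
import os, hmac, hashlib

def prf(key: bytes, msg: bytes) -> bytes:
    """Keyed pseudo-random function; a real tag would use a block cipher such as AES."""
    return hmac.new(key, msg, hashlib.sha256).digest()[:16]

class Tag:
    def __init__(self, key: bytes, pseudonym: bytes):
        self.key, self.pseudonym = key, pseudonym

    def respond(self, reader_nonce: bytes):
        tag_nonce = os.urandom(16)
        proof = prf(self.key, b"tag" + reader_nonce + tag_nonce)
        return self.pseudonym, tag_nonce, proof

    def verify_reader(self, tag_nonce: bytes, reader_proof: bytes) -> bool:
        return hmac.compare_digest(reader_proof, prf(self.key, b"reader" + tag_nonce))

class Reader:
    def __init__(self):
        self.table = {}  # pseudonym -> tag key: direct lookup, no exhaustive key search

    def register(self, pseudonym: bytes, key: bytes):
        self.table[pseudonym] = key

    def authenticate(self, tag: Tag) -> bool:
        reader_nonce = os.urandom(16)
        pseudonym, tag_nonce, proof = tag.respond(reader_nonce)
        key = self.table.get(pseudonym)
        if key is None or not hmac.compare_digest(proof, prf(key, b"tag" + reader_nonce + tag_nonce)):
            return False
        # Mutual authentication: the reader proves knowledge of the key back to the tag.
        return tag.verify_reader(tag_nonce, prf(key, b"reader" + tag_nonce))

key, pseudonym = os.urandom(16), os.urandom(16)
tag, reader = Tag(key, pseudonym), Reader()
reader.register(pseudonym, key)
print(reader.authenticate(tag))  # True
```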

    Parameterized Complexity of the k-anonymity Problem

    The problem of publishing personal data without giving up privacy is becoming increasingly important. An interesting formalization that has been recently proposed is k-anonymity. This approach requires that the rows of a table are partitioned into clusters of size at least k and that all the rows in a cluster become the same tuple, after the suppression of some entries. The natural optimization problem, where the goal is to minimize the number of suppressed entries, is known to be APX-hard even when the record values are over a binary alphabet and k = 3, and when the records have length at most 8 and k = 4. In this paper we study how the complexity of the problem is influenced by different parameters. Following this direction of research, we first show that the problem is W[1]-hard when parameterized by the size of the solution (and the value k). Then we exhibit a fixed-parameter algorithm when the problem is parameterized by the size of the alphabet and the number of columns. Finally, we investigate the computational (and approximation) complexity of the k-anonymity problem when restricting the instance to records of length bounded by 3 and k = 3. We show that such a restriction is APX-hard. Comment: 22 pages, 2 figures
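    To make the optimization objective concrete, here is a small sketch (not from the paper) that computes the suppression cost of a given clustering: all rows of a cluster are made identical by replacing disagreeing entries with '*', and the cost is the total number of suppressed entries. Function names and the toy table are mine.

```python
# Suppression cost of a clustering for k-anonymity over a binary alphabet (illustrative).
from typing import List, Sequence, Tuple

def suppress_cluster(rows: List[Sequence[str]]) -> Tuple[str, ...]:
    """Replace every column on which the cluster's rows disagree with '*'."""
    return tuple(col[0] if len(set(col)) == 1 else "*" for col in zip(*rows))

def suppression_cost(table: List[Sequence[str]], clusters: List[List[int]], k: int) -> int:
    """Total number of suppressed entries; every cluster must contain at least k rows."""
    cost = 0
    for cluster in clusters:
        if len(cluster) < k:
            raise ValueError("every cluster must contain at least k rows")
        rows = [table[i] for i in cluster]
        cost += suppress_cluster(rows).count("*") * len(rows)
    return cost

# Toy instance with k = 3: the three rows disagree on two columns, so making them
# identical suppresses 2 entries in each of the 3 rows, for a cost of 6.
table = [("0", "1", "0"), ("0", "0", "0"), ("0", "1", "1")]
print(suppression_cost(table, [[0, 1, 2]], k=3))  # -> 6
```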

    On the Complexity of t-Closeness Anonymization and Related Problems

    An important issue in releasing individual data is to protect the sensitive information from being leaked and maliciously utilized. Famous privacy preserving principles that aim to ensure both data privacy and data integrity, such as k-anonymity and l-diversity, have been extensively studied both theoretically and empirically. Nonetheless, these widely-adopted principles are still insufficient to prevent attribute disclosure if the attacker has partial knowledge about the overall sensitive data distribution. The t-closeness principle has been proposed to fix this, which also has the benefit of supporting numerical sensitive attributes. However, in contrast to k-anonymity and l-diversity, the theoretical aspect of t-closeness has not been well investigated. We initiate the first systematic theoretical study on the t-closeness principle under the commonly-used attribute suppression model. We prove that for every constant t such that 0 ≤ t < 1, it is NP-hard to find an optimal t-closeness generalization of a given table. The proof consists of several reductions, each of which works for different values of t, which together cover the full range. To complement this negative result, we also provide exact and fixed-parameter algorithms. Finally, we answer some open questions regarding the complexity of k-anonymity and l-diversity left in the literature. Comment: An extended abstract to appear in DASFAA 201
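    As a toy illustration of the principle itself (not of the paper's reductions or algorithms), the sketch below checks whether a partition into equivalence classes satisfies t-closeness for a categorical sensitive attribute. The principle is usually stated with the Earth Mover's Distance; using the variational (total variation) distance here is a simplifying assumption, and all names are mine.

```python
# t-closeness check for a categorical sensitive attribute (illustrative, variational distance).
from collections import Counter
from typing import Dict, List

def distribution(values: List[str]) -> Dict[str, float]:
    counts = Counter(values)
    return {v: c / len(values) for v, c in counts.items()}

def variational_distance(p: Dict[str, float], q: Dict[str, float]) -> float:
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

def satisfies_t_closeness(classes: List[List[str]], t: float) -> bool:
    """classes: sensitive-attribute values grouped by equivalence class."""
    overall = distribution([v for cls in classes for v in cls])
    return all(variational_distance(distribution(cls), overall) <= t for cls in classes)

# Toy check: an equivalence class dominated by a single disease violates 0.2-closeness.
classes = [["flu", "flu", "flu"], ["flu", "hiv", "cancer"]]
print(satisfies_t_closeness(classes, t=0.2))  # -> False
```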

    Test in a beam of large-area Micromegas chambers for sampling calorimetry

    Application of Micromegas for sampling calorimetry puts specific constraints on the design and performance of this gaseous detector. In particular, uniform and linear response, low noise and stability against high ionisation density deposits are prerequisites to achieving good energy resolution. A Micromegas-based hadronic calorimeter was proposed for an application at a future linear collider experiment, and three technologically advanced prototypes of 1×1 m² were constructed. Their merits relative to the above-mentioned criteria are discussed on the basis of measurements performed at the CERN SPS test-beam facility.

    Adaptive Alert Management for Balancing Optimal Performance among Distributed CSOCs using Reinforcement Learning

    Large organizations typically have Cybersecurity Operations Centers (CSOCs) distributed at multiple locations that are independently managed and have their own cybersecurity analyst workforce. Under normal operating conditions, the CSOC locations are ideally staffed such that the alerts generated from the sensors in a work-shift are thoroughly investigated by the scheduled analysts in a timely manner. Unfortunately, when adverse events such as an increase in alert arrival rates or alert investigation rates occur, alerts have to wait for a longer duration for analyst investigation, which poses a direct risk to organizations. Hence, our research objective is to mitigate the impact of the adverse events by dynamically and autonomously re-allocating alerts to other location(s) such that the performances of all the CSOC locations remain balanced. This is achieved through the development of a novel centralized adaptive decision support system whose task is to re-allocate alerts from the affected locations to other locations. This re-allocation decision is non-trivial because the following must be determined: (1) the timing of a re-allocation decision, (2) the number of alerts to be re-allocated, and (3) the selection of the locations to which the alerts must be distributed. The centralized decision-maker (henceforth referred to as the agent) continuously monitors and controls the level of operational effectiveness (LOE), a quantified performance metric, of all the locations. The agent's decision-making framework is based on the principles of stochastic dynamic programming and is solved using reinforcement learning (RL). In the experiments, the RL approach is compared with both rule-based and load-balancing strategies. By simulating real-world scenarios, learning the best decisions for the agent, and applying the decisions on sample realizations of the CSOC's daily operation, the results show that the RL agent outperforms both approaches by generating (near-)optimal decisions that maintain a balanced LOE among the CSOC locations. Furthermore, the scalability experiments highlight the practicality of adapting the method to a large number of CSOC locations.
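    The abstract does not specify the state and action spaces or the LOE formula, so the following is only a minimal tabular Q-learning sketch of a centralized agent that periodically moves a batch of alerts from the most loaded to the least loaded location in a toy queueing simulation; every constant, the discretization, and the reward are assumptions, not the paper's design.

```python
# Toy Q-learning sketch of a centralized alert re-allocation agent (illustrative only).
import random
from collections import defaultdict

N_LOCATIONS = 3
ACTIONS = [0, 10, 20]  # number of alerts to re-allocate per decision epoch (hypothetical)

def step(backlogs, action):
    """Toy environment: move alerts from the busiest to the least busy CSOC, then apply
    random arrivals minus service, and reward a balanced workload."""
    src = max(range(N_LOCATIONS), key=lambda i: backlogs[i])
    dst = min(range(N_LOCATIONS), key=lambda i: backlogs[i])
    moved = min(action, backlogs[src])
    backlogs[src] -= moved
    backlogs[dst] += moved
    for i in range(N_LOCATIONS):
        backlogs[i] = max(0, backlogs[i] + random.randint(0, 30) - 20)
    return backlogs, -(max(backlogs) - min(backlogs))  # penalize backlog spread

def discretise(backlogs):
    return tuple(b // 25 for b in backlogs)

Q = defaultdict(float)
alpha, gamma, eps = 0.1, 0.95, 0.1
backlogs = [0] * N_LOCATIONS
state = discretise(backlogs)
for _ in range(50_000):
    a = random.choice(ACTIONS) if random.random() < eps else max(ACTIONS, key=lambda x: Q[state, x])
    backlogs, reward = step(backlogs, a)
    nxt = discretise(backlogs)
    best_next = max(Q[nxt, x] for x in ACTIONS)
    Q[state, a] += alpha * (reward + gamma * best_next - Q[state, a])
    state = nxt
```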

    Empowering Owners with Control in Digital Data Markets

    We propose an approach for allowing data owners to trade their data in digital data market scenarios, while keeping control over them. Our solution is based on a combination of selective encryption and smart contracts deployed on a blockchain, and ensures that only authorized users who have paid an agreed amount can access a data item. We propose a safe interaction protocol for regulating the interplay between a data owner and subjects wishing to purchase (a subset of) her data, and an audit process for counteracting possible misbehaviors by any of the interacting parties. Our solution aims to take a step towards the realization of data market platforms where owners can benefit from trading their data while maintaining control.
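    As a rough illustration of the selective-encryption idea (each data item encrypted under its own key, with the key released only once a smart contract records the agreed payment), here is a sketch in which the blockchain contract is replaced by a mock Python class; the LedgerContract name, the pricing logic, and the use of Fernet are assumptions, not the paper's design.

```python
# Selective encryption with payment-gated key release (blockchain mocked, illustrative).
from cryptography.fernet import Fernet

class Owner:
    def __init__(self, items):
        # One key per data item, so access can be granted item by item.
        self.keys = {name: Fernet.generate_key() for name in items}
        self.ciphertexts = {name: Fernet(self.keys[name]).encrypt(data)
                            for name, data in items.items()}

class LedgerContract:
    """Mock of an on-chain contract that records payments for named items."""
    def __init__(self, price):
        self.price, self.paid = price, set()

    def pay(self, buyer, item, amount):
        if amount >= self.price:
            self.paid.add((buyer, item))

owner = Owner({"heart_rate_2023": b"72,74,71,..."})
contract = LedgerContract(price=10)
contract.pay("alice", "heart_rate_2023", 10)

# The owner releases the per-item key only if the contract shows the payment.
if ("alice", "heart_rate_2023") in contract.paid:
    key = owner.keys["heart_rate_2023"]
    print(Fernet(key).decrypt(owner.ciphertexts["heart_rate_2023"]))
```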

    Synthetic sequence generator for recommender systems - memory biased random walk on sequence multilayer network

    Personalized recommender systems rely on each user's personal usage data in the system in order to assist in decision making. However, privacy policies protecting users' rights prevent these highly personal data from being made publicly available to a wider researcher audience. In this work, we propose a memory-biased random walk model on a multilayer sequence network as a generator of synthetic sequential data for recommender systems. We demonstrate the applicability of the synthetic data in training recommender system models for cases when privacy policies restrict clickstream publishing. Comment: The new updated version of the paper
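    The abstract names two ingredients, a network built from observed sequences and a walk biased by the walker's recent memory, but not the multilayer construction itself. The single-layer sketch below only illustrates those two ingredients; the bias form and all parameter values are assumptions.

```python
# Memory-biased random walk over a transition network built from observed sequences (illustrative).
import random
from collections import defaultdict

def build_network(sequences):
    """Weighted item-to-item transition counts from observed sequences."""
    edges = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            edges[a][b] += 1
    return edges

def memory_biased_walk(edges, start, length, memory_size=3, bias=2.0):
    walk, memory = [start], [start]
    for _ in range(length - 1):
        neighbours = edges.get(walk[-1])
        if not neighbours:
            break
        items = list(neighbours)
        # Transitions toward recently visited items are up-weighted by `bias` (hypothetical form).
        weights = [neighbours[i] * (bias if i in memory else 1.0) for i in items]
        nxt = random.choices(items, weights=weights, k=1)[0]
        walk.append(nxt)
        memory = (memory + [nxt])[-memory_size:]
    return walk

sequences = [["a", "b", "c", "a"], ["a", "c", "b"], ["b", "a", "c"]]
net = build_network(sequences)
print(memory_biased_walk(net, start="a", length=6))  # one synthetic session
```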

    Supporting Concurrency and Multiple Indexes in Private Access to Outsourced Data

    Data outsourcing has recently emerged as a successful solution allowing individuals and organizations to delegate data and service management to external third parties. A major challenge in the data outsourcing scenario is how to guarantee proper privacy protection against the external server. Recent promising approaches rely on organizing data in encrypted indexing structures and on the dynamic allocation of encrypted data to physical blocks, destroying the otherwise static relationship between the data and the blocks in which they are stored. However, dynamic data allocation implies the need to re-write blocks at every read access, thus requiring exclusive locks that can affect concurrency. Also, these solutions only support search conditions on the values of the attribute used for building the indexing structure. In this paper, we present an approach that overcomes such limitations by extending the recently proposed shuffle index structure with support for concurrency and multiple indexes. Support for concurrency relies on the use of several differential versions of the data index that are periodically reconciled and applied to the main data structure. Support for multiple indexes relies on the definition of secondary shuffle indexes that are combined with the primary index in a single data structure whose content and allocation are unintelligible to the server. We show how using such differential versions and a combined index structure guarantees privacy, provides support for concurrent accesses and multiple search conditions, and considerably increases the performance of the system and the applicability of the proposed solution.
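    The paper builds on the shuffle index's dynamic re-allocation of encrypted nodes to physical blocks. The sketch below only illustrates that base mechanism: every read touches the target block plus some cover blocks and re-shuffles their logical-to-physical mapping before writing back, so the server cannot link repeated accesses to the same logical node. The differential versions and secondary indexes the paper adds are omitted, and all names are hypothetical.

```python
# Toy shuffle of logical keys across physical blocks on every access (illustrative).
import random

class ShuffleStore:
    def __init__(self, items, num_cover=2):
        self.num_cover = num_cover
        self.physical = {i: value for i, value in enumerate(items.values())}   # block id -> data
        self.position = {key: i for i, key in enumerate(items)}                # logical key -> block id

    def access(self, key):
        target = self.position[key]
        covers = random.sample([b for b in self.physical if b != target], self.num_cover)
        touched = [target] + covers
        value = self.physical[target]
        # Re-shuffle the touched blocks' contents so the target's physical location changes.
        shuffled = random.sample(touched, len(touched))
        contents = {b: self.physical[b] for b in touched}
        keys_at = {b: k for k, b in self.position.items() if b in touched}
        for old, new in zip(touched, shuffled):
            self.physical[new] = contents[old]
            self.position[keys_at[old]] = new
        return value

store = ShuffleStore({"alice": "row-1", "bob": "row-2", "carol": "row-3", "dave": "row-4"})
print(store.access("alice"), store.position)  # alice's physical block changes across accesses
```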

    Open world reasoning in semantics-aware access control: A preliminary study

    We address the relationships between theoretical foundations of Description Logics and practical applications of security-oriented Semantic Web techniques. We first describe the advantages of semantics-aware Access Control and review the state of the art; we also introduce the basics of Description Logics and the novel semantics they share. Then we translate the principle underlying the Little House Problem of DL into a real-world use case: by applying Open World Reasoning to the Knowledge Base modelling a Virtual Organization, we derive information not achievable with traditional Access Control methodologies. With this example, we also show that a general problem such as ontology mapping can take advantage of the enhanced semantics underlying OWL Lite and OWL DL to handle under-specified concepts.
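    To make the closed-world versus open-world distinction the abstract relies on concrete, here is a toy contrast in plain Python; the actual study works with OWL Lite / OWL DL ontologies and a DL reasoner, so this ad-hoc encoding and the policy strings are purely illustrative.

```python
# Closed-world vs open-world evaluation of an access request (toy illustration).
FACTS = {("alice", "memberOf", "VirtualOrg")}   # what the Knowledge Base asserts
NEGATED = set()                                 # what it explicitly denies

def closed_world_allows(subject):
    # Closed world: anything not asserted is taken to be false.
    return (subject, "memberOf", "VirtualOrg") in FACTS

def open_world_allows(subject):
    # Open world: distinguish proven, disproven, and unknown.
    if (subject, "memberOf", "VirtualOrg") in FACTS:
        return "granted"
    if (subject, "memberOf", "VirtualOrg") in NEGATED:
        return "denied"
    return "unknown: ask for more evidence or apply a default policy"

print(closed_world_allows("bob"))  # False: absence of the assertion is treated as negation
print(open_world_allows("bob"))    # unknown: the Knowledge Base simply does not say
```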