Service Security and Privacy as a Socio-Technical Problem: Literature review, analysis methodology and challenge domains
Published online September 2015; accepted 15 September 2014. The security and privacy of the data that users transmit, more or less deliberately, to modern services is an open problem. It is not solely limited to the actual Internet traversal, a sub-problem largely addressed by consolidated research in security protocol design and analysis. By contrast, it entails much broader dimensions pertaining to how users approach technology and understand the risks for the data they enter. For example, users may express cautious or distracted personas depending on the service and the point in time; further, pre-established paths of practice may lead them to neglect the intrusive privacy policy offered by one service, or the outdated protections adopted by another. The approach that sees the service security and privacy problem as a socio-technical one needs consolidation. With this motivation, the article makes a threefold contribution. It reviews the existing literature on service security and privacy, especially from the socio-technical standpoint. Further, it outlines a general research methodology aimed at layering the problem appropriately, at suggesting how to position existing findings, and ultimately at indicating where a transdisciplinary task force may fit in. The article concludes with a description of the three challenge domains of services whose security and privacy we deem open socio-technical problems, not only due to their inherent facets but also due to their huge numbers of users.
Optimal joint routing and link scheduling for real-time traffic in TDMA Wireless Mesh Networks
We investigate the problem of joint routing and link scheduling in Time-Division Multiple Access (TDMA) Wireless Mesh Networks (WMNs) carrying real-time traffic. We propose a framework that always computes a feasible solution (i.e., a set of paths and link activations) if one exists, by optimally solving a mixed-integer nonlinear problem. Such a solution can be computed in minutes or tens of minutes for, e.g., grids of up to 4x4 nodes. We also propose heuristics based on Lagrangian decomposition to compute suboptimal solutions considerably faster and/or for larger WMNs, up to about 50 nodes. We show that the heuristic solutions are near-optimal, and we exploit them to gain insight into schedulability in WMNs, i.e., to investigate the optimal placement of one or more gateways from a delay-bound perspective, and to investigate how schedulability is affected by the transmission range.
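To fix ideas, the following is a rough sketch of what a joint routing and link-scheduling formulation of this kind looks like; the variable names and constraints are our simplified generic model, not the authors' exact MINLP. Binary variable x^f_{ij} routes flow f over link (i,j), and binary variable s^t_{ij} activates that link in TDMA slot t of a frame of length T:

\begin{align*}
& \sum_{j:(i,j)\in E} x^f_{ij} - \sum_{j:(j,i)\in E} x^f_{ji} =
  \begin{cases} 1 & i = \mathrm{src}(f)\\ -1 & i = \mathrm{dst}(f)\\ 0 & \text{otherwise} \end{cases}
  \quad \forall f,\ \forall i
  && \text{(flow conservation: each flow follows one path)}\\
& s^t_{ij} + s^t_{uv} \le 1
  \quad \forall t,\ \forall \text{ interfering pairs } \{(i,j),(u,v)\}
  && \text{(conflicting links never share a slot)}\\
& x^f_{ij} \le \sum_{t=1}^{T} s^t_{ij}
  \quad \forall f,\ \forall (i,j)\in E
  && \text{(a link carrying traffic must be scheduled)}\\
& x^f_{ij},\ s^t_{ij} \in \{0,1\}
\end{align*}

An instance is feasible exactly when this system admits a solution; the full formulation described in the abstract additionally enforces the per-flow delay bounds, which are omitted here.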
Single-photon detection and cryogenic reconfigurability in Lithium Niobate nanophotonic circuits
Lithium-Niobate-On-Insulator (LNOI) is emerging as a promising platform for integrated quantum photonic technologies because of its high second-order nonlinearity and compact waveguide footprint. Importantly, LNOI allows for creating electro-optically reconfigurable circuits, which can be efficiently operated at cryogenic temperature. Their integration with superconducting nanowire single-photon detectors (SNSPDs) paves the way for realizing scalable photonic devices for active manipulation and detection of quantum states of light. Here we report the first demonstration of these two key components integrated in a low-loss (0.2 dB/cm) LNOI waveguide network. As an experimental showcase of our technology, we demonstrate the combined operation of an electrically tunable Mach-Zehnder interferometer and two waveguide-integrated SNSPDs at its outputs. We show static reconfigurability of our system with bias-drift-free operation over a time of 12 hours, as well as high-speed modulation at frequencies up to 1 GHz. Our results provide blueprints for implementing complex quantum photonic devices on the LNOI platform.
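As a reading aid for the interferometer result above, the short Python sketch below evaluates the textbook transfer function of an ideal, lossless Mach-Zehnder interferometer with an electro-optic phase shifter; the half-wave voltage V_PI and the drive voltages are illustrative assumptions, not parameters of the reported device.

import numpy as np

# Ideal, lossless MZI driven through an electro-optic phase shifter.
# V_PI (half-wave voltage) is an assumed, illustrative value.
V_PI = 3.0  # volts

def mzi_outputs(voltage, v_pi=V_PI):
    """Normalized power at the two MZI outputs (bar, cross) for a drive voltage."""
    phase = np.pi * voltage / v_pi       # electro-optically induced phase shift
    p_bar = np.cos(phase / 2.0) ** 2     # power routed to output 1
    p_cross = np.sin(phase / 2.0) ** 2   # power routed to output 2
    return p_bar, p_cross

if __name__ == "__main__":
    for v in np.linspace(0.0, 2 * V_PI, 5):
        bar, cross = mzi_outputs(v)
        print(f"V = {v:4.1f} V  ->  bar = {bar:.2f}, cross = {cross:.2f}")

In a device like the one described, each of the two outputs feeds a waveguide-integrated SNSPD, so the detected click rates follow these two complementary curves as the bias voltage is swept.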
Remote secure object authentication: Secure sketches, fuzzy extractors, and security protocols
Coating objects with microscopic droplets of liquid crystals makes it possible to identify and authenticate objects as if they had biometric-like features: this is extremely valuable as an anti-counterfeiting measure. How to extract features from images has been studied elsewhere, but exchanging data about features is not enough if we wish to build secure cryptographic authentication protocols. What we need are authentication tokens (i.e., bitstrings), strategies to cope with the noise that is always present when processing images, and solutions to protect the original features so that it is impossible to reproduce them from the tokens. Secure sketches and fuzzy extractors are the cryptographic toolkits that offer these functionalities, but they must be instantiated to work with the specific features extracted from images of liquid crystals. We show how this can work, how we can obtain uniform, error-tolerant, and random strings, and how they are used to authenticate liquid-crystal-coated objects. Our protocol resembles an existing biometric-based protocol, but only superficially: using the original protocol as it is would make the process vulnerable to an attack that exploits certain physical peculiarities of our liquid crystal coatings, whereas our protocol is robust against that attack. We prove all our security claims formally, by modelling and verifying our protocol and its cryptographic schemes in ProVerif. We implement and benchmark our solution, measuring both the performance and the quality of authentication.
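To make the secure-sketch and fuzzy-extractor step more concrete, here is a toy Python sketch of the classic code-offset construction; the 5x repetition code and SHA-256 are stand-ins chosen for brevity, not the code or extractor the paper instantiates for liquid-crystal features.

import hashlib
import secrets

REP = 5  # each message bit is spread over a 5-bit repetition block

def _encode(msg):
    return [b for b in msg for _ in range(REP)]

def _decode(coded):
    # majority vote inside each repetition block
    return [int(sum(coded[i:i + REP]) > REP // 2) for i in range(0, len(coded), REP)]

def _xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def enroll(features):
    """Derive an authentication token and public helper data from a feature
    bitstring whose length is a multiple of REP."""
    msg = [secrets.randbelow(2) for _ in range(len(features) // REP)]
    helper = _xor(features, _encode(msg))           # code-offset: masks the features
    token = hashlib.sha256(bytes(msg)).hexdigest()  # stand-in for a randomness extractor
    return token, helper

def reproduce(noisy_features, helper):
    """Recover the same token from a noisy re-reading of the features,
    tolerating up to 2 flipped bits per 5-bit block."""
    msg = _decode(_xor(noisy_features, helper))
    return hashlib.sha256(bytes(msg)).hexdigest()

if __name__ == "__main__":
    w = [secrets.randbelow(2) for _ in range(40)]  # enrollment feature bits
    token, helper = enroll(w)
    w_noisy = w[:]
    w_noisy[3] ^= 1                                # simulate image-processing noise
    w_noisy[17] ^= 1
    print(token == reproduce(w_noisy, helper))     # True: the noise is absorbed

The helper data masks the features with a random codeword, so the token can only be re-derived by someone able to re-measure features close to the originals; this is the property the abstract refers to when it says the original features cannot be reproduced from the tokens.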
Direct characterization of a nonlinear photonic circuit's wave function with laser light
Integrated photonics is a leading platform for quantum technologies, including nonclassical state generation [1-4], demonstration of quantum computational complexity [5], and secure quantum communications [6]. As photonic circuits grow in complexity, full quantum tomography becomes impractical, and an efficient method for their characterization [7, 8] is therefore essential. Here we propose and demonstrate a fast, reliable method for reconstructing the two-photon state produced by an arbitrary quadratically nonlinear optical circuit. By establishing a rigorous correspondence between the generated quantum state and classical sum-frequency generation measurements performed with laser light, we overcome the limitations of previous approaches for lossy multi-mode devices [9, 10]. We applied this protocol to a multi-channel nonlinear waveguide network and measured a 99.28±0.31% fidelity between the classical and quantum characterizations. This technique enables fast and precise evaluation of nonlinear quantum photonic networks, a crucial step towards complex, large-scale device production.
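As a hedged illustration of the comparison reported above, the following Python snippet computes the fidelity |<psi_a|psi_b>|^2 between two normalized two-photon amplitude matrices; the matrices here are random stand-ins, not the reconstructed states from the experiment.

import numpy as np

def fidelity(psi_a, psi_b):
    """|<psi_a|psi_b>|^2 for two two-photon amplitude matrices (normalized first)."""
    psi_a = psi_a / np.linalg.norm(psi_a)
    psi_b = psi_b / np.linalg.norm(psi_b)
    return abs(np.vdot(psi_a, psi_b)) ** 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    psi_quantum = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    # a slightly perturbed copy stands in for the classically reconstructed state
    psi_classical = psi_quantum + 0.05 * (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))
    print(f"fidelity = {fidelity(psi_quantum, psi_classical):.4f}")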
Hybrid photonic integrated circuits for neuromorphic computing [Invited]
The burgeoning of artificial intelligence has brought great convenience to people’s lives as large-scale computational models have emerged. Artificial intelligence-related applications, such as autonomous driving, medical diagnosis, and speech recognition, have made remarkable progress in recent years; however, such systems require vast amounts of data for accurate inference and reliable performance, presenting challenges in both speed and power consumption. Neuromorphic computing based on photonic integrated circuits (PICs) is currently of interest as a route to high-speed, energy-efficient, and low-latency data processing that alleviates some of these challenges. Herein, we present an overview of the currently available photonic platforms, the materials that can potentially be integrated with PICs to further improve performance, and recent progress in hybrid devices for neuromorphic computing.
Statistical design of personalized medicine interventions: The Clarification of Optimal Anticoagulation through Genetics (COAG) trial
Background:
There is currently much interest in pharmacogenetics: determining variation in genes that regulate drug effects, with a particular emphasis on improving drug safety and efficacy. The ability to determine such variation motivates the application of personalized drug therapies that utilize a patient's genetic makeup to determine a safe and effective drug at the correct dose. To ascertain whether a genotype-guided drug therapy improves patient care, a personalized medicine intervention may be evaluated within the framework of a randomized controlled trial. The statistical design of this type of personalized medicine intervention requires special considerations: the distribution of relevant allelic variants in the study population; and whether the pharmacogenetic intervention is equally effective across subpopulations defined by allelic variants.
Methods:
The statistical design of the Clarification of Optimal Anticoagulation through Genetics (COAG) trial serves as an illustrative example of a personalized medicine intervention that uses each subject's genotype information. The COAG trial is a multicenter, double-blind, randomized clinical trial that will compare two approaches to the initiation of warfarin therapy: genotype-guided dosing, the initiation of warfarin therapy based on algorithms using clinical information and genotypes for polymorphisms in CYP2C9 and VKORC1; and clinical-guided dosing, the initiation of warfarin therapy based on algorithms using only clinical information.
Results:
We determine an absolute minimum detectable difference of 5.49% based on an assumed 60% population prevalence of zero or multiple genetic variants in either CYP2C9 or VKORC1 and an assumed 15% relative effectiveness of genotype-guided warfarin initiation for those with zero or multiple genetic variants. Thus we calculate a sample size of 1238 to achieve a power level of 80% for the primary outcome. We show that reasonable departures from these assumptions may decrease statistical power to 65%.
Conclusions:
In a personalized medicine intervention, the minimum detectable difference used in sample size calculations is not a known quantity, but rather an unknown quantity that depends on the genetic makeup of the subjects enrolled. Given the possible sensitivity of sample size and power calculations to these key assumptions, we recommend that they be monitored during the conduct of a personalized medicine intervention.
Trial Registration:
clinicaltrials.gov: NCT00839657
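A minimal sketch of the two-arm sample-size arithmetic underlying a design like this is given below; alpha, power, and the outcome standard deviation are our illustrative assumptions, and the actual COAG calculation further folds in the assumed 60% variant prevalence and 15% relative effectiveness to arrive at its 5.49% minimum detectable difference and n = 1238.

from scipy.stats import norm

def n_per_arm(delta, sd, alpha=0.05, power=0.80):
    """Subjects per arm needed to detect an absolute difference `delta` in a
    continuous outcome with standard deviation `sd`, using a two-sided z-test."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return 2 * (sd * (z_alpha + z_beta) / delta) ** 2

if __name__ == "__main__":
    # e.g. a 5.49-point difference in the primary outcome, assuming SD = 25 points
    print(round(n_per_arm(delta=5.49, sd=25.0)), "subjects per arm")

The point made in the Conclusions is visible directly in this formula: the detectable difference delta itself depends on the genetic makeup of the enrolled subjects, so the computed sample size (and hence the achieved power) shifts if the prevalence or effectiveness assumptions are off.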
Relations between lipoprotein(a) concentrations, LPA genetic variants, and the risk of mortality in patients with established coronary heart disease: a molecular and genetic association study
Background:
Lipoprotein(a) concentrations in plasma are associated with cardiovascular risk in the general population. Whether lipoprotein(a) concentrations or LPA genetic variants predict long-term mortality in patients with established coronary heart disease remains less clear.
Methods:
We obtained data from 3313 patients with established coronary heart disease in the Ludwigshafen Risk and Cardiovascular Health (LURIC) study. We tested associations of tertiles of lipoprotein(a) concentration in plasma and two LPA single-nucleotide polymorphisms ([SNPs] rs10455872 and rs3798220) with all-cause mortality and cardiovascular mortality by Cox regression analysis and with severity of disease by generalised linear modelling, with and without adjustment for age, sex, diabetes diagnosis, systolic blood pressure, BMI, smoking status, estimated glomerular filtration rate, LDL-cholesterol concentration, and use of lipid-lowering therapy. Results for plasma lipoprotein(a) concentrations were validated in five independent studies involving 10 195 patients with established coronary heart disease. Results for genetic associations were replicated through large-scale collaborative analysis in the GENIUS-CHD consortium, comprising 106 353 patients with established coronary heart disease and 19 332 deaths in 22 studies or cohorts.
Findings:
The median follow-up was 9·9 years. Increased severity of coronary heart disease was associated with lipoprotein(a) concentrations in plasma in the highest tertile (adjusted hazard ratio [HR] 1·44, 95% CI 1·14–1·83) and the presence of either LPA SNP (1·88, 1·40–2·53). No associations were found in LURIC with all-cause mortality (highest tertile of lipoprotein(a) concentration in plasma 0·95, 0·81–1·11 and either LPA SNP 1·10, 0·92–1·31) or cardiovascular mortality (0·99, 0·81–1·2 and 1·13, 0·90–1·40, respectively) or in the validation studies.
Interpretation:
In patients with prevalent coronary heart disease, lipoprotein(a) concentrations and genetic variants showed no associations with mortality. We conclude that measuring these variables is not useful for predicting progression to death once coronary heart disease is established.
Funding:
Seventh Framework Programme for Research and Technical Development (AtheroRemo and RiskyCAD), INTERREG IV Oberrhein Programme, Deutsche Nierenstiftung, Else-Kroener Fresenius Foundation, Deutsche Stiftung für Herzforschung, Deutsche Forschungsgemeinschaft, Saarland University, German Federal Ministry of Education and Research, Willy Robert Pitzer Foundation, and Waldburg-Zeil Clinics Isny
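As an illustration of the Cox regression analysis described in the Methods, the sketch below fits a proportional-hazards model with the lifelines library; the data frame, column names, and values are invented placeholders, and the actual analysis used the LURIC cohort with the full covariate set listed above.

import pandas as pd
from lifelines import CoxPHFitter

# Toy stand-in for the analysis dataset (invented values, not LURIC data).
df = pd.DataFrame({
    "followup_years":  [9.2, 4.1, 10.0, 7.5, 2.3, 9.9, 6.0, 3.5],
    "died":            [0,   1,   0,    1,   1,   0,   0,   1  ],
    "lpa_top_tertile": [1,   1,   0,    0,   1,   0,   1,   0  ],  # Lp(a) in highest tertile
    "age":             [61,  70,  55,   66,  72,  68,  63,  69 ],
})

# A small penalizer keeps the fit stable on this tiny illustrative dataset.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="followup_years", event_col="died")
cph.print_summary()  # hazard ratios with 95% confidence intervals

In the study itself, the exposure of interest was the lipoprotein(a) tertile or LPA SNP carrier status, the adjustment covariates were those listed in the Methods (age, sex, diabetes, systolic blood pressure, BMI, smoking status, eGFR, LDL cholesterol, and lipid-lowering therapy), and the resulting adjusted hazard ratios are those quoted in the Findings.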