Fine-grained timing using genetic programming
In previous work, we have demonstrated that it is possible to use Genetic Programming to minimise the resource consumption of software, such as its power consumption or execution time. In this paper, we investigate the extent to which Genetic Programming can be used to gain fine-grained control over software timing. We introduce the ideas behind our work and carry out experiments, finding that Genetic Programming is indeed able to produce software with unusual and desirable timing properties, where it is not obvious how a manual approach could replicate such results. In general, we discover that Genetic Programming is most effective at controlling statistical properties of software timing rather than exercising precise control over the timing for individual inputs. This control may find useful application in cryptography and embedded systems.
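A toy sketch of the idea, not the authors' actual system: here a "program" is just a list of loop-iteration counts with simulated noisy running times, and a simple evolutionary loop selects variants whose timing *standard deviation* approaches a target, illustrating control over a statistical timing property rather than exact per-input timing. All names and the noise model are illustrative assumptions.

```python
import random
import statistics

random.seed(0)

TARGET_STDDEV = 3.0  # desired spread of running times (arbitrary units)

def simulated_times(program, runs=30):
    """Simulate noisy running times: base cost plus jitter that grows
    with program size (a stand-in for microarchitectural noise)."""
    base = sum(program)
    return [base + random.gauss(0, 0.1 * len(program)) for _ in range(runs)]

def fitness(program):
    """Distance of the observed timing stddev from the target (lower is better)."""
    return abs(statistics.stdev(simulated_times(program)) - TARGET_STDDEV)

def mutate(program):
    """Point mutation on one segment, plus occasional structural growth/shrink."""
    child = program[:]
    i = random.randrange(len(child))
    child[i] = max(1, child[i] + random.choice([-1, 1]))
    if random.random() < 0.3:
        if random.random() < 0.5 and len(child) > 1:
            child.pop()
        else:
            child.append(random.randint(1, 5))
    return child

# (1+lambda)-style evolutionary loop over a small population
population = [[random.randint(1, 5) for _ in range(4)] for _ in range(20)]
for _ in range(100):
    population.sort(key=fitness)
    parent = population[0]
    population = [parent] + [mutate(parent) for _ in range(19)]

best = min(population, key=fitness)
```

A real GP system would evolve actual code (e.g. assembly or source trees) and measure wall-clock or cycle counts; the selection-by-timing-statistic loop is the part this sketch shows.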
Mitigating Branch-Shadowing Attacks on Intel SGX using Control Flow Randomization
Intel Software Guard Extensions (SGX) is a promising hardware-based
technology for protecting sensitive computations from potentially compromised
system software. However, recent research has shown that SGX is vulnerable to
branch-shadowing -- a side channel attack that leaks the fine-grained (branch
granularity) control flow of an enclave (SGX protected code), potentially
revealing sensitive data to the attacker. The previously-proposed defense
mechanism, called Zigzagger, attempted to hide the control flow, but has been
shown to be ineffective if the attacker can single-step through the enclave
using the recent SGX-Step framework.
Taking into account these stronger attacker capabilities, we propose a new
defense against branch-shadowing, based on control flow randomization. Our
scheme is inspired by Zigzagger, but provides quantifiable security guarantees
with respect to a tunable security parameter. Specifically, we eliminate
conditional branches and hide the targets of unconditional branches using a
combination of compile-time modifications and run-time code randomization.
We evaluated the performance of our approach by measuring the run-time
overhead of ten benchmark programs of SGX-Nbench in an SGX environment.
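One building block the abstract mentions, eliminating conditional branches, can be illustrated with a mask-based (cmov-style) select: both "paths" are computed and the secret bit picks the result arithmetically, so control flow no longer depends on the secret. This is an illustrative Python sketch only; the paper's defense operates at compile time on enclave binaries, and the function names here are ours.

```python
def branchy_select(secret_bit, a, b):
    # Branch-shadowing target: which branch executes depends on secret_bit.
    if secret_bit:
        return a
    return b

def branchless_select(secret_bit, a, b):
    # Constant control flow: build an all-ones or all-zeros mask from the
    # secret bit and combine both candidates arithmetically (cmov-style).
    mask = -int(secret_bit)          # 1 -> ...111 (i.e. -1), 0 -> 0
    return (a & mask) | (b & ~mask)
```

Both functions compute the same value, but only the first contains a secret-dependent branch for a branch-shadowing attacker to observe.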
Barriers That Influence Adoption of ACL Injury Prevention Programs Among High School Girls’ Soccer Coaches
Molecular Analysis of the Prostacyclin Receptor’s Interaction with the PDZ1 Domain of Its Adaptor Protein PDZK1
The prostanoid prostacyclin, or prostaglandin I2, plays an essential role in many aspects of cardiovascular disease. The actions of prostacyclin are mainly mediated through its activation of the prostacyclin receptor or, in short, the IP. In recent studies, the cytoplasmic carboxy-terminal domain of the IP was shown to bind several PDZ domains of the multi-PDZ adaptor PDZK1. The interaction between the two proteins was found to enhance cell surface expression of the IP and to be functionally important in promoting prostacyclin-induced endothelial cell migration and angiogenesis. To investigate the interaction of the IP with the first PDZ domain (PDZ1) of PDZK1, we generated a nine-residue peptide (KK411IAACSLC417) containing the seven carboxy-terminal amino acids of the IP and measured its binding affinity to a recombinant protein corresponding to PDZ1 by isothermal titration calorimetry. We determined that the IP interacts with PDZ1 with a binding affinity of 8.2 µM. Using the same technique, we also determined that the farnesylated form of the carboxy-terminus of the IP does not bind to PDZ1. To understand the molecular basis of these findings, we solved the high-resolution crystal structure of PDZ1 bound to a 7-residue peptide derived from the carboxy-terminus of the non-farnesylated form of the IP (411IAACSLC417). Analysis of the structure demonstrates a critical role for the three carboxy-terminal amino acids in establishing a strong interaction with PDZ1 and explains the inability of the farnesylated form of the IP to interact with the PDZ1 domain of PDZK1, at least in vitro.
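The reported dissociation constant translates into a binding free energy via the standard thermodynamic relation ΔG° = RT ln(Kd). A quick check, assuming 25 °C (the abstract does not state the ITC temperature):

```python
import math

R = 8.314      # gas constant, J/(mol*K)
T = 298.15     # assumed 25 degC; not stated in the abstract
Kd = 8.2e-6    # reported dissociation constant, mol/L

# Standard binding free energy relative to the 1 M standard state:
# dG = R * T * ln(Kd)
dG_kJ_per_mol = R * T * math.log(Kd) / 1000.0
# about -29 kJ/mol, i.e. a moderate-affinity PDZ-peptide interaction
```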
Evaluating Modeling and Validation Strategies for Tooth Loss
Prediction models learn patterns from available data (training) and are then validated on new data (testing). Prediction modeling is increasingly common in dental research. We aimed to evaluate how different model development and validation steps affect the predictive performance of tooth loss prediction models of patients with periodontitis. Two independent cohorts (627 patients, 11,651 teeth) were followed over a mean ± SD 18.2 ± 5.6 y (Kiel cohort) and 6.6 ± 2.9 y (Greifswald cohort). Tooth loss and 10 patient- and tooth-level predictors were recorded. The impact of different model development and validation steps was evaluated: 1) model complexity (logistic regression, recursive partitioning, random forest, extreme gradient boosting), 2) sample size (full data set or 10%, 25%, or 75% of cases dropped at random), 3) prediction periods (maximum 10, 15, or 20 y or uncensored), and 4) validation schemes (internal or external by centers/time). Tooth loss was generally a rare event (880 teeth were lost). All models showed limited sensitivity but high specificity. Patients' age and tooth loss at baseline as well as probing pocket depths showed high variable importance. More complex models (random forest, extreme gradient boosting) had no consistent advantages over simpler ones (logistic regression, recursive partitioning). Internal validation (in sample) overestimated the predictive power (area under the curve up to 0.90), while external validation (out of sample) found lower areas under the curve (range 0.62 to 0.82). Reducing the sample size decreased the predictive power, particularly for more complex models. Censoring the prediction period had only limited impact. When the model was trained in one period and tested in another, model outcomes were similar to the base case, indicating temporal validation as a valid option. No model showed higher accuracy than the no-information rate. 
In conclusion, none of the developed models would be useful in a clinical setting, despite their high accuracy. During modeling, rigorous development and external validation should be applied and reported accordingly.
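The internal-versus-external validation gap the study reports can be made concrete with a toy example (our illustration, not the study's data or models): a hand-rolled Mann-Whitney AUC and a one-variable risk score, evaluated on the cohort it was built on versus an independent cohort whose score distribution has shifted.

```python
def auc(pairs):
    """Mann-Whitney AUC: probability that a tooth that was lost (y=1)
    receives a higher risk score than a tooth that was kept (y=0)."""
    pos = [s for s, y in pairs if y == 1]
    neg = [s for s, y in pairs if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical (risk_score, tooth_lost) pairs. In the development cohort the
# score separates outcomes well; in the external cohort it does not.
internal = [(0.9, 1), (0.6, 1), (0.7, 0), (0.4, 0), (0.2, 0), (0.1, 0)]
external = [(0.6, 1), (0.3, 1), (0.7, 0), (0.4, 0), (0.2, 0), (0.5, 0)]

auc_internal = auc(internal)   # optimistic in-sample estimate
auc_external = auc(external)   # lower out-of-sample, as the study found
```

The same mechanism, in-sample evaluation overstating discrimination, is why the study's internal AUCs (up to 0.90) exceed the external ones (0.62 to 0.82).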
A Novel Scaffold-Based Hybrid Multicellular Model for Pancreatic Ductal Adenocarcinoma-Toward a Better Mimicry of the in vivo Tumor Microenvironment
With a very low survival rate, pancreatic ductal adenocarcinoma (PDAC) is a deadly disease. This has been primarily attributed to (i) its late diagnosis and (ii) its high resistance to current treatment methods. The latter specifically requires the development of robust, realistic in vitro models of PDAC, capable of accurately mimicking the in vivo tumor niche. Advancements in the field of tissue engineering (TE) have helped the development of such models for PDAC. Herein, we report for the first time a novel hybrid, polyurethane (PU) scaffold-based, long-term, multicellular (tri-culture) model of pancreatic cancer involving cancer cells, endothelial cells, and stellate cells. Recognizing the importance of ECM proteins for optimal growth of different cell types, the model consists of two different zones/compartments: an inner tumor compartment consisting of cancer cells [fibronectin (FN)-coated] and a surrounding stromal compartment consisting of stellate and endothelial cells [collagen I (COL)-coated]. Our developed novel hybrid, tri-culture model supports the proliferation of all different cell types for 35 days (5 weeks), which is the longest reported timeframe in vitro. Furthermore, the hybrid model showed extensive COL production by the cells, mimicking desmoplasia, one of PDAC's hallmark features. Fibril alignment of the stellate cells was observed, which attested to their activated state. All three cell types expressed various cell-specific markers within the scaffolds, throughout the culture period and showed cellular migration between the two zones of the hybrid scaffold. Our novel model has great potential as a low-cost tool for in vitro studies of PDAC, as well as for treatment screening
Quantifying Timing Leaks and Cost Optimisation
We develop a new notion of security against timing attacks where the attacker
is able to simultaneously observe the execution time of a program and the
probability of the values of low variables. We then show how to measure the
security of a program with respect to this notion via a computable estimate of
the timing leakage and use this estimate for cost optimisation.
Comment: 16 pages, 2 figures, 4 tables. A shorter version is included in the proceedings of ICICS'08, the 10th International Conference on Information and Communications Security, 20-22 October 2008, Birmingham, U.K.
On EPR paradox, Bell's inequalities and experiments which prove nothing
This article shows that there is no paradox. Violation of Bell's inequalities should not be identified with a proof of non-locality in quantum mechanics. A number of past experiments are reviewed, and it is concluded that the experimental results should be re-evaluated. The results of the experiments with atomic cascades are shown not to contradict local realism. The article points out flaws in the experiments with down-converted photons. The experiments with a neutron interferometer measuring "contextuality" and Bell-like inequalities are analyzed, and it is shown that the experimental results can be explained without such notions. An alternative experiment is proposed to prove the validity of local realism.
Comment: 27 pages, 8 figures. Minor edits to the text and abstract; corrected equations (49) and (50).
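For reference, the arithmetic the Bell-inequality debate turns on: for the singlet state, quantum mechanics predicts the correlation E(a, b) = −cos(a − b) between analyzers at angles a and b, while any local-realistic model must satisfy the CHSH bound |S| ≤ 2. The standard angle choices make the quantum value 2√2.

```python
import math

def E(a, b):
    """Quantum singlet-state correlation for analyzer angles a and b."""
    return -math.cos(a - b)

# Standard CHSH angle choices
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
# |S| = 2*sqrt(2) ~ 2.828 at these angles, exceeding the local bound of 2
```

The article's thesis is precisely that observed violations of this bound admit explanations other than non-locality; the computation above only fixes what the quantum prediction and the local bound are.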