IIFA: Modular Inter-app Intent Information Flow Analysis of Android Applications
Android apps cooperate through message passing via intents. However, when
apps do not have identical sets of privileges, inter-app communication (IAC)
can accidentally or maliciously be misused, e.g., to leak sensitive
information contrary to users' expectations. Recent research considered static program
analysis to detect dangerous data leaks due to inter-component communication
(ICC) or IAC, but suffers from shortcomings with respect to precision,
soundness, and scalability. To solve these issues, we propose a novel approach
for static ICC/IAC analysis. We perform a fixed-point iteration of ICC/IAC
summary information to precisely resolve intent communication with more than
two apps involved. We integrate these results with information flows generated
by a baseline (i.e. not considering intents) information flow analysis, and
resolve if sensitive data is flowing (transitively) through components/apps in
order to be ultimately leaked. Our main contribution is the first fully
automatic sound and precise ICC/IAC information flow analysis that is scalable
for realistic apps due to modularity, avoiding combinatorial explosion: Our
approach determines communicating apps using short summaries rather than
inlining intent calls, which often requires simultaneously analyzing all tuples
of apps. We evaluated our tool IIFA in terms of scalability, precision, and
recall. Using benchmarks we establish that precision and recall of our
algorithm are considerably better than prominent state-of-the-art analyses for
IAC. Most importantly, applied to the 90 most popular applications from the
Google Play Store, IIFA demonstrated its scalability to a large corpus of
real-world apps. IIFA reports 62 problematic ICC-/IAC-related information
flows involving two or more apps/components.
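The summary-based fixed-point iteration can be sketched abstractly. The representation below (intent action strings, per-app summary maps, a `resolve_flows` helper) is hypothetical and far simpler than what IIFA computes over Android bytecode; it only illustrates the iteration scheme:

```python
# Minimal sketch of modular inter-app flow resolution (illustrative data
# model, not IIFA's actual representation). Each app exports a summary
# mapping received intent actions to the actions it re-sends; a fixed-point
# iteration over these summaries resolves transitive intent chains without
# analyzing all tuples of apps together.

def resolve_flows(summaries, sources):
    """summaries: {app: {incoming_action: set of re-sent actions}}
    sources: set of (app, action) pairs where sensitive data enters.
    Returns every (app, action) pair transitively reachable by intents."""
    reached = set(sources)
    changed = True
    while changed:  # iterate until no new (app, action) pair is added
        changed = False
        for app, action in list(reached):
            for out in summaries.get(app, {}).get(action, set()):
                # deliver the re-sent intent to every app that handles it
                for receiver, handlers in summaries.items():
                    if out in handlers and (receiver, out) not in reached:
                        reached.add((receiver, out))
                        changed = True
    return reached
```

Because each app contributes only its own summary, adding an app to the corpus means computing one new summary rather than re-analyzing every tuple of apps, which is the modularity argument the abstract makes.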
Generalizing Permissive-Upgrade in Dynamic Information Flow Analysis
Preventing implicit information flows by dynamic program analysis requires
coarse approximations that result in false positives, because a dynamic monitor
sees only the executed trace of the program. One widely deployed method is the
no-sensitive-upgrade check, which terminates a program whenever a variable's
taint is upgraded (made more sensitive) due to a control dependence on tainted
data. Although sound, this method is restrictive, e.g., it terminates the
program even if the upgraded variable is never used subsequently. To counter
this, Austin and Flanagan introduced the permissive-upgrade check, which allows
a variable upgrade due to control dependence, but marks the variable
"partially-leaked". The program is stopped later if it tries to use the
partially-leaked variable. Permissive-upgrade handles the dead-variable
assignment problem and remains sound. However, Austin and Flanagan develop
permissive-upgrade only for a two-point (low-high) security lattice and
indicate a generalization to pointwise products of such lattices. In this
paper, we develop a non-trivial and non-obvious generalization of
permissive-upgrade to arbitrary lattices. The key difficulty lies in finding a
suitable notion of partial leaks that is both sound and permissive, and in
developing a suitable definition of memory equivalence that allows an inductive
proof of soundness.
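The contrast between the two checks can be sketched on the two-point lattice. The labels and the "partially leaked" marker below are illustrative and do not reproduce the paper's formalism:

```python
# Toy two-point (L < H) monitor contrasting the no-sensitive-upgrade (NSU)
# check with Austin and Flanagan's permissive-upgrade check. The marker P
# ("partially leaked") is illustrative notation.

L, H, P = "L", "H", "P"

class IFCError(Exception):
    pass

def join(a, b):
    # least upper bound on the two-point lattice
    return H if H in (a, b) else L

def assign_nsu(var_label, pc):
    # NSU: stop as soon as a low variable is written under a high pc
    if pc == H and var_label == L:
        raise IFCError("sensitive upgrade")
    return join(var_label, pc)

def assign_permissive(var_label, pc):
    # Permissive upgrade: allow the write but mark the result P;
    # the monitor stops later only if the P-labelled value is used.
    if pc == H and var_label == L:
        return P
    return join(var_label, pc)

def use(var_label):
    # reading a partially-leaked variable is what triggers the stop
    if var_label == P:
        raise IFCError("use of partially leaked variable")
    return var_label
```

Under a high pc, `assign_nsu` terminates the program immediately, while `assign_permissive` defers the error to a later `use`; this is why a dead assignment under secret control flow no longer stops the program.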
Information Flow Control in WebKit's JavaScript Bytecode
Websites today routinely combine JavaScript from multiple sources, both
trusted and untrusted. Hence, JavaScript security is of paramount importance. A
specific interesting problem is information flow control (IFC) for JavaScript.
In this paper, we develop, formalize and implement a dynamic IFC mechanism for
the JavaScript engine of a production Web browser (specifically, Safari's
WebKit engine). Our IFC mechanism works at the level of JavaScript bytecode and
hence leverages years of industrial effort on optimizing both the source to
bytecode compiler and the bytecode interpreter. We track both explicit and
implicit flows and observe only moderate overhead. Working with bytecode
results in new challenges including the extensive use of unstructured control
flow in bytecode (which complicates lowering of program context taints),
unstructured exceptions (which complicate the matter further) and the need to
make IFC analysis permissive. We explain how we address these challenges,
formally model the JavaScript bytecode semantics and our instrumentation, prove
the standard property of termination-insensitive non-interference, and present
experimental results on an optimized prototype.
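A toy monitor (plain Python, not WebKit code) illustrates the mechanism and why unstructured control flow is the hard part: every write joins the value's taint with the program-context (pc) taint, and the pc entry pushed at a branch must be popped exactly at the branch's join point, which is easy to locate in structured source but must be computed explicitly for bytecode:

```python
# Illustrative sketch of explicit and implicit flow tracking with a
# pc-taint stack (two-point lattice, hypothetical names -- not the
# paper's instrumentation).

def join(a, b):
    return "H" if "H" in (a, b) else "L"

class Monitor:
    def __init__(self):
        self.pc = ["L"]        # stack of program-context taints
        self.mem = {}          # variable -> (value, taint)

    def store(self, var, value, taint):
        # explicit flow: taint of the assigned expression,
        # implicit flow: joined with the current pc taint
        self.mem[var] = (value, join(taint, self.pc[-1]))

    def branch_on(self, taint):
        # entering a conditional whose guard has the given taint
        self.pc.append(join(taint, self.pc[-1]))

    def merge(self):
        # leaving the conditional; with unstructured bytecode, finding
        # where to place this pop is the lowering problem the paper faces
        self.pc.pop()
```

For example, storing a public value inside a branch on secret data yields a high-tainted variable, capturing the implicit flow even though the assigned expression itself was public.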
Range corrections in Proton Halo Nuclei
We analyze the effects of finite-range corrections in halo effective field
theory for S-wave proton halo nuclei. We calculate the charge radius to
next-to-leading order and the astrophysical S-factor for low-energy proton
capture to fifth order in the low-energy expansion. As an application, we
confront our results with experimental data for the S-factor for proton capture
on Oxygen-16 into the excited state of Fluorine-17. Our low-energy
theory is characterized by a systematic low-energy expansion, which can be used
to quantify an energy-dependent model error to be utilized in data fitting.
Finally, we show that the existence of proton halos is suppressed by the need
for two fine tunings in the underlying theory. Comment: 30 pages, 12 figures.
Constraining Low-Energy Proton Capture on Beryllium-7 through Charge Radius Measurements
In this paper, we point out that a measurement of the charge radius of
Boron-8 provides indirect access to the S-factor for radiative proton capture
on Beryllium-7 at low energies. We use leading-order halo effective field
theory to explore this correlation and we give a relation between the charge
radius and the S-factor. Furthermore, we present important technical aspects
relevant to the renormalization of pointlike P-wave interactions in the
presence of a repulsive Coulomb interaction. Comment: Accepted for publication
in the European Physical Journal A; 29 pages, 9 figures.
A spatial panel data version of the knowledge capital model
This paper analyzes the impact of knowledge and knowledge spillovers on regional total factor productivity (TFP) in Europe. Regional patent stocks are used as a proxy for knowledge, and TFP is measured in terms of a superlative index. We follow Fischer et al. (2008) by using a spatial-spillover model and a data set covering 203 regions for six time periods. In order to estimate the impact of knowledge stocks we use a spatial autoregressive model with random effects, which allows for three kinds of spatial dependence: spatial correlation in the innovations, the exogenous variables, and the endogenous variable. The results suggest that there is a significant positive impact of knowledge on regional TFP levels, and that knowledge spills over to neighboring regions. These spillovers decay exponentially with distance at a rate of 8%. Using Monte Carlo simulations we calculate the distribution of direct and indirect effects. The average elasticity of a region's TFP with respect to its own knowledge stock is 0.2 and highly significant. The average effect of all other regions' knowledge stocks is about 50% higher, which confirms that cross-region externalities are important when measuring the impact of knowledge.
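A spatial autoregressive specification with all three kinds of dependence the abstract lists (a lag of the dependent variable, lags of the regressors, and spatially correlated innovations) has roughly the following form; the notation is generic and not taken from the paper:

```latex
y = \rho W y + X\beta + W X \theta + u, \qquad u = \lambda W u + \varepsilon
```

Here \(W\) is the spatial weight matrix over the regions, \(\rho\) and \(\lambda\) govern dependence in the endogenous variable and the innovations, and the direct and indirect (spillover) effects reported in the abstract are functions of \(\rho\), \(\beta\), and \(\theta\).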
Dirac fermion wave guide networks on topological insulator surfaces
Magnetic texturing on the surface of a topological insulator allows the
design of wave guide networks and beam splitters for domain-wall Dirac
fermions. Guided by simple analytic arguments we model a Dirac fermion
interferometer consisting of two parallel pathways, whereby a newly developed
staggered-grid leap-frog discretization scheme in 2+1 dimensions with absorbing
boundary conditions is employed. The net transmission can be tuned between
constructive to destructive interference, either by variation of the
magnetization (path length) or an applied bias (wave length). Based on this
principle, a Dirac fermion transistor is proposed. Extensions to more general
networks are discussed. Comment: Submitted to PR
CEO Ownership Relevant for Firm Performance? - A Study of the Swedish Market
The purpose of this thesis is to examine if and how CEO ownership affects firm performance in Swedish companies depending on market conditions. We also examine whether this relationship still holds when the possibility of an endogenously determined relationship is taken into account. A quantitative approach using regression analysis and descriptive statistics has been employed; the regression analysis was conducted with an ordinary least squares (OLS) regression and a two-stage least squares (2SLS) regression. The data have been obtained from annual reports and Datastream. The study has a deductive approach, with a theoretical perspective derived from classical agency theory as well as entrenchment theory. Companies listed on Nasdaq OMX Stockholm Large and Mid Cap during 2000-2006 have been studied empirically. Results from the OLS regressions confirm a positive relation between firm performance and CEO ownership on the Swedish market. The relationship holds for our bear-market period (2000-2002) as well as for the whole period (2000-2006); during the bull market the relationship cannot be statistically supported. These results imply that market conditions affect the relationship, and hence that external forces have an impact. A two-stage least squares regression was therefore conducted to examine whether the relationship could be endogenously determined; the result indicates that the relationship may be determined by exogenous forces and thus does not hold.
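The two estimation steps can be sketched for a single regressor and a single instrument. The closed-form slope formulas below are standard, but the variable roles (x as the possibly endogenous ownership measure, z as an instrument) are illustrative only:

```python
# Minimal single-regressor sketch of OLS versus two-stage least squares
# (2SLS). Variable roles are hypothetical: x stands in for the possibly
# endogenous regressor, z for an instrument correlated with x but not
# with the error term.

def ols_slope(x, y):
    # closed-form simple-regression slope: cov(x, y) / var(x)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

def tsls_slope(x, y, z):
    # Stage 1: regress x on the instrument z, keep fitted values
    a = ols_slope(z, x)
    mz = sum(z) / len(z)
    mx = sum(x) / len(x)
    xhat = [mx + a * (zi - mz) for zi in z]
    # Stage 2: regress y on the fitted values from stage 1
    return ols_slope(xhat, y)
```

When the regressor is in fact exogenous and the instrument is valid, the OLS and 2SLS slopes agree; a substantial gap between them is the informal signal that the OLS relationship may be endogenously determined, which is the comparison the thesis performs.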