A General Framework for Static Profiling of Parametric Resource Usage
Traditional static resource analyses estimate the total resource usage of a
program, without executing it. In this paper we present a novel resource
analysis whose aim is instead the static profiling of accumulated cost, i.e.,
to discover, for selected parts of the program, an estimate or bound of the
resource usage accumulated in each of those parts. Traditional resource
analyses are parametric in the sense that the results can be functions on input
data sizes. Our static profiling is parametric in the same sense: the
accumulated cost estimates are likewise functions of input data sizes. Our proposal is based on
the concept of cost centers and a program transformation that allows the static
inference of functions that return bounds on these accumulated costs depending
on input data sizes, for each cost center of interest. Such information is much
more useful to the software developer than the traditional resource usage
functions, as it allows identifying the parts of a program that should be
optimized, because of their greater impact on the total cost of program
executions. We also report on our implementation of the proposed technique
using the CiaoPP program analysis framework, and provide some experimental
results. This paper is under consideration for acceptance in TPLP.
Comment: Paper presented at the 32nd International Conference on Logic
Programming (ICLP 2016), New York City, USA, 16-21 October 2016, 22 pages, LaTeX.
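To make the idea concrete, here is a minimal sketch of what parametric accumulated-cost bounds per cost center could look like. All names and bound functions below are invented for illustration; a real analysis such as CiaoPP would infer such functions automatically from the program rather than have them written by hand.

```python
# Hypothetical parametric cost-center bounds: each cost center gets an
# upper-bound function of the input data size n, so the developer can
# see which part of the program dominates as inputs grow.

def parse_cost_ub(n):
    """Assumed upper bound on cost accumulated in a 'parse' cost center,
    as a function of input size n (e.g. list length): linear."""
    return 3 * n + 2

def sort_cost_ub(n):
    """Assumed upper bound on cost accumulated in a 'sort' cost center:
    quadratic, so it dominates for large inputs."""
    return n * n + n

def dominant_cost_center(n, centers):
    """Return the cost center with the largest bound at input size n,
    i.e. the part of the program most worth optimizing."""
    return max(centers, key=lambda name: centers[name](n))

centers = {"parse": parse_cost_ub, "sort": sort_cost_ub}
print(dominant_cost_center(4, centers))   # parse: 14 vs sort: 20 -> "sort"
print(dominant_cost_center(1, centers))   # parse: 5 vs sort: 2 -> "parse"
```

The point of the sketch is the shift of perspective the abstract describes: instead of one total-cost function for the whole program, each cost center carries its own function of the input sizes, so the dominant contributor can change with the input.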
Surface mixing and biological activity in the four Eastern Boundary Upwelling Systems
Eastern Boundary Upwelling Systems (EBUS) are characterized by a high
productivity of plankton associated with large commercial fisheries, thus
playing key biological and socio-economical roles. The aim of this work is to
make a comparative study of these four upwelling systems focussing on their
surface stirring, using the Finite Size Lyapunov Exponents (FSLEs), and their
biological activity, based on satellite data. First, the spatial distribution
of horizontal mixing is analysed from time averages and from probability
density functions of FSLEs. Then we study the temporal variability of surface
stirring, focussing on the annual and seasonal cycles. There is a global negative
correlation between surface horizontal mixing and chlorophyll standing stocks
over the four areas. To better understand this inverse relationship, we
consider the vertical dimension by looking at the Ekman-transport and vertical
velocities. We suggest the possibility of a changing response of the
phytoplankton to sub/mesoscale turbulence, from a negative effect in the very
productive coastal areas to a positive one in the open ocean.
Comment: 12 pages. NPG Special Issue on "Nonlinear processes in oceanic and
atmospheric flows". Open Access paper, available also at the publisher site:
http://www.nonlin-processes-geophys.net/16/557/2009
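For reference, the Finite Size Lyapunov Exponent used as the stirring diagnostic above is commonly defined as follows (this is the standard definition from the FSLE literature, not a formula taken from the abstract):

```latex
\lambda(\mathbf{x}, t, \delta_0, \delta_f) \;=\; \frac{1}{\tau}\,\ln\frac{\delta_f}{\delta_0}
```

where $\tau$ is the time needed for two fluid particles, initially separated by a distance $\delta_0$ at position $\mathbf{x}$ and time $t$, to reach a final separation $\delta_f$. Larger values of $\lambda$ correspond to stronger horizontal stirring.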
Towards Energy Consumption Verification via Static Analysis
In this paper we leverage an existing general framework for resource usage
verification and specialize it for verifying energy consumption specifications
of embedded programs. Such specifications can include both lower and upper
bounds on energy usage, expressed as intervals within which the energy usage
is to be certified to lie. The bounds of these intervals can in general be
given as functions on input data sizes. Our verification
system can prove whether such energy usage specifications are met or not. It
can also infer the particular conditions under which the specifications hold.
To this end, these conditions are also expressed as intervals of functions of
input data sizes, such that a given specification can be proved for some
intervals but disproved for others. The specifications themselves can also
include preconditions expressing intervals for input data sizes. We report on a
prototype implementation of our approach within the CiaoPP system for the XC
language and XS1-L architecture, and illustrate with an example how embedded
software developers can use this tool, in particular to determine values
for program parameters that ensure meeting a given energy budget while
minimizing the loss in quality of service.
Comment: Presented at HIP3ES, 2015 (arXiv: 1501.03064).
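The interval-based check the abstract describes can be sketched in a few lines. The bound functions and the budget below are invented for illustration (they are not the output of the CiaoPP/XC toolchain); the sketch only shows the shape of the reasoning: a specification is proved for those input sizes where the inferred usage interval lies inside the specified interval.

```python
# Assumed energy bounds inferred by analysis, as functions of input size n.
def inferred_energy_lb(n):
    return 2 * n

def inferred_energy_ub(n):
    return 5 * n + 40

def spec_holds(n, spec_lb, spec_ub):
    """Spec is met at size n if the inferred interval [lb(n), ub(n)]
    lies inside the specified interval [spec_lb(n), spec_ub(n)]."""
    return spec_lb(n) <= inferred_energy_lb(n) and inferred_energy_ub(n) <= spec_ub(n)

def holds_for_sizes(sizes, spec_lb, spec_ub):
    """Split input sizes into those where the spec is proved and the rest,
    mirroring 'proved for some intervals but disproved for others'."""
    proved = [n for n in sizes if spec_holds(n, spec_lb, spec_ub)]
    not_proved = [n for n in sizes if not spec_holds(n, spec_lb, spec_ub)]
    return proved, not_proved

# Hypothetical energy budget: usage must stay in [0, 100] for sizes 1..20.
proved, not_proved = holds_for_sizes(range(1, 21), lambda n: 0, lambda n: 100)
print(max(proved))   # largest input size still within the budget
```

This mirrors the use case at the end of the abstract: scanning over a program parameter (here the input size) to find the largest value that still meets a given energy budget.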
An Approach to Static Performance Guarantees for Programs with Run-time Checks
Instrumenting programs for performing run-time checking of properties, such
as regular shapes, is a common and useful technique that helps programmers
detect incorrect program behaviors. This is especially true in dynamic languages
such as Prolog. However, such run-time checks inevitably introduce run-time
overhead (in execution time, memory, energy, etc.). Several approaches have
been proposed for reducing such overhead, such as eliminating the checks that
can statically be proved to always succeed, and/or optimizing the way in which
the (remaining) checks are performed. However, there are cases in which it is
not possible to remove all checks statically (e.g., open libraries which must
check their interfaces, complex properties, unknown code, etc.) and in which,
even after optimizations, these remaining checks still may introduce an
unacceptable level of overhead. It is thus important for programmers to be able
to determine the additional cost due to the run-time checks and compare it to
some notion of admissible cost. The common practice used for estimating
run-time checking overhead is profiling, which is not exhaustive by nature.
Instead, we propose a method that uses static analysis to estimate such
overhead, with the advantage that the estimations are functions parameterized
by input data sizes. Unlike profiling, this approach can provide guarantees for
all possible execution traces, and allows assessing how the overhead grows as
the size of the input grows. Our method also extends an existing assertion
verification framework to express "admissible" overheads, and statically and
automatically checks whether the instrumented program conforms with such
specifications. Finally, we present an experimental evaluation of our approach
that suggests that our method is feasible and promising.
Comment: 15 pages, 3 tables; submitted to ICLP'18, accepted as technical
communication.
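The core comparison the abstract proposes can be sketched as follows. All cost functions and the admissibility criterion are invented for illustration; in the actual approach these functions would be inferred by static analysis, not written by hand.

```python
# Toy illustration: static analysis yields cost-bound functions of the
# input size n for the original and the instrumented program; their
# difference bounds the run-time checking overhead for ALL inputs of
# size n, unlike profiling, which only covers the traces actually run.

def cost_plain_ub(n):
    """Assumed cost upper bound for the original program."""
    return 10 * n

def cost_instrumented_ub(n):
    """Assumed cost upper bound with run-time checks added."""
    return 10 * n + 4 * n

def overhead_ub(n):
    """Upper bound on the overhead introduced by the run-time checks."""
    return cost_instrumented_ub(n) - cost_plain_ub(n)

def overhead_admissible(n, budget):
    """Check an 'admissible overhead' assertion for input size n."""
    return overhead_ub(n) <= budget(n)

# Hypothetical spec: overhead may not exceed 50% of the plain cost.
print(overhead_admissible(100, lambda n: cost_plain_ub(n) // 2))  # 400 <= 500
```

Because `overhead_ub` is itself a function of `n`, one can also read off how the overhead grows with the input size, which is exactly the guarantee profiling cannot give.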
Detection of enteric parasite DNA in household and bed dust samples: potential for infection transmission.
BACKGROUND: Enteric parasites are transmitted in households, but few studies have sampled inside households for parasites and none have used sensitive molecular methods.
METHODS: We collected bed and living room dust samples from households of children participating in a clinical trial of anthelmintic treatment in rural coastal Ecuador. Dust was examined for the presence of DNA specific for 11 enteric parasites (Ascaris lumbricoides, Trichuris trichiura, Ancylostoma duodenale, Necator americanus, Strongyloides stercoralis, Toxocara canis and T. cati, Giardia lamblia, Blastocystis hominis, Cryptosporidium spp., and Entamoeba histolytica) by quantitative PCR (qPCR).
RESULTS: Of the 38 households sampled, 37 had dust positive for at least one parasite, and up to 8 parasites were detected in single samples. Positivity was greatest for B. hominis (79% of household samples), indicating a high level of environmental fecal contamination. Dust positivity rates for individual pathogens were: S. stercoralis (52%), A. lumbricoides (39%), G. lamblia (39%), Toxocara spp. (42%), hookworm (18%) and T. trichiura (8%). DNA for Cryptosporidium spp. and E. histolytica was not detected. Bed dust was more frequently positive than floor samples for all parasites detected. Positivity for A. lumbricoides DNA in bed dust (adjusted OR: 10.0, 95% CI: 2.0-50.1) but not floor dust (adjusted OR: 3.6, 95% CI: 0.3-37.9) was significantly associated with active infections in children.
CONCLUSIONS: To our knowledge, this is the first use of qPCR on environmental samples to detect a wide range of enteric pathogen DNA. Our results indicate widespread contamination of households with parasite DNA and raise the possibility that beds, under conditions of overcrowding in a humid tropical setting, may be a source of transmission.
Implications of a Sub-Threshold Resonance for Stellar Beryllium Depletion
Abundance measurements of the light elements lithium, beryllium, and boron
are playing an increasingly important role in the study of stellar physics.
Because these elements are easily destroyed in stars at temperatures 2--4
million K, the abundances in the surface convective zone are diagnostics of the
star's internal workings. Standard stellar models cannot explain depletion
patterns observed in low mass stars, and so are not accounting for all the
relevant physical processes. These processes have important implications for
stellar evolution and primordial lithium production in big bang
nucleosynthesis. Because beryllium is destroyed at slightly higher temperatures
than lithium, observations of both light elements can differentiate between the
various proposed depletion mechanisms. Unfortunately, the reaction rate for the
main destruction channel, 9Be(p,alpha)6Li, is uncertain. A level in the
compound nucleus 10B is only 25.7 keV below the reaction's energetic threshold.
The angular momentum and parity of this level are not well known; current
estimates indicate that the resonance entrance channel is either s- or d-wave.
We show that an s-wave resonance can easily increase the reaction rate by an
order of magnitude at temperatures of approximately 4 million K. Observations
of sub-solar mass stars can constrain the strength of the resonance, as can
experimental measurements at lab energies lower than 30 keV.
Comment: 9 pages, 1 ps figure, uses AASTeX macros and epsfig.sty. Reference
added, typos corrected. To appear in ApJ, 10 March 199
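As background for the temperature sensitivity the abstract invokes (these are standard nuclear-astrophysics relations, not formulae from the paper itself): for an isolated narrow resonance at energy $E_r$, the thermally averaged reaction rate scales as

```latex
\langle \sigma v \rangle \;\propto\; (kT)^{-3/2}\,\omega\gamma\; e^{-E_r/kT},
\qquad
\omega\gamma \;=\; \frac{2J+1}{(2J_1+1)(2J_2+1)}\,\frac{\Gamma_p\,\Gamma_\alpha}{\Gamma}
```

where $J$ is the spin of the resonance, $J_1$ and $J_2$ are the spins of the interacting nuclei, and the $\Gamma$'s are partial and total widths. The exponential factor makes the rate extremely sensitive to the resonance energy and strength at the few-million-kelvin temperatures quoted. For a sub-threshold level ($E_r < 0$), it is the high-energy tail of the resonance that contributes to the low-energy cross section, which is why the strength and the entrance-channel orbital angular momentum (s- versus d-wave) of the 10B level matter so strongly.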