Analyzing program analyses
We want to prove that a static analysis of a given program is complete, namely, that no imprecision arises when asking some query on the program behavior in the concrete (i.e., for its concrete semantics) or in the abstract (i.e., for its abstract interpretation). Completeness proofs are therefore useful for assigning confidence to alarms raised by static analyses. We introduce the completeness class of an abstraction as the set of all programs for which the abstraction is complete. Our first result shows that for any nontrivial abstraction, its completeness class is not recursively enumerable. We then introduce a stratified deductive system a2A for proving the completeness of program analyses over an abstract domain A, and we prove the soundness of this deductive system. We observe that the only sources of incompleteness are assignments and Boolean tests; contrary to a common belief in static analysis, joins do not induce incompleteness. The first layer of this proof system is generic and abstraction-agnostic: it deals with the standard constructs for program composition, that is, sequential composition, branching, and guarded iteration. The second layer is instead abstraction-specific: the designer of an abstract domain A provides conditions for the completeness in A of assignments and Boolean tests, which have to be checked by a suitable static analysis or assumed as hypotheses in the completeness proof. We first instantiate the second layer of this proof system with a generic nonrelational abstraction in order to provide a sound rule for the completeness of assignments. Orthogonally, we instantiate it for the numerical abstract domains of Intervals and Octagons, providing necessary and sufficient conditions for the completeness of their Boolean tests and of assignments for Octagons.
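The claim that Boolean tests, not joins, are a source of incompleteness can be checked concretely on the Interval domain. The sketch below (not the paper's a2A proof system; function names are illustrative) enumerates small sets of integer states and verifies that the assignment x := x + 1 is complete for Intervals, while the test x == 0 is not:

```python
# Completeness over Intervals: an operation f with abstract counterpart
# f_sharp is complete on a set of states S when alpha(f(S)) == f_sharp(alpha(S)),
# where alpha maps a set of integers to its tightest enclosing interval.
from itertools import chain, combinations

BOT = None  # bottom: the empty interval

def alpha(s):
    return (min(s), max(s)) if s else BOT

# Concrete and abstract semantics of the assignment x := x + 1.
def inc(s):
    return {x + 1 for x in s}

def inc_sharp(iv):
    return BOT if iv is BOT else (iv[0] + 1, iv[1] + 1)

# Concrete filter and interval transformer for the Boolean test x == 0.
def eq0(s):
    return {x for x in s if x == 0}

def eq0_sharp(iv):
    if iv is BOT or iv[0] > 0 or iv[1] < 0:
        return BOT
    return (0, 0)  # meet with the interval [0, 0]

def subsets(universe):
    return chain.from_iterable(combinations(universe, k)
                               for k in range(len(universe) + 1))

universe = range(-2, 3)
# The assignment is complete on every set of states ...
assert all(alpha(inc(set(s))) == inc_sharp(alpha(set(s)))
           for s in subsets(universe))
# ... but the test is not: on S = {-1, 1} the concrete filter yields the
# empty set, while the interval transformer yields [0, 0].
S = {-1, 1}
assert alpha(eq0(S)) is BOT and eq0_sharp(alpha(S)) == (0, 0)
```

The counterexample works because alpha({-1, 1}) = [-1, 1] already "invents" the state 0 that the concrete set never contained, so precision is lost before the test ever runs.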
Manual of phosphoric acid fuel cell stack three-dimensional model and computer program
A detailed distributed mathematical model of a phosphoric acid fuel cell stack has been developed, together with a FORTRAN computer program, for analyzing the temperature distribution in the stack and the associated current density distribution on the cell plates. Energy, mass, and electrochemical analyses of the stack were combined to develop the model. Several reasonable assumptions were made to solve this mathematical model by means of the finite-difference numerical method.
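The finite-difference approach the abstract mentions can be illustrated with a much-reduced example. The sketch below is generic, not the stack model itself (which is three-dimensional and coupled to mass and electrochemical balances): it applies an explicit finite-difference scheme to 1-D transient heat conduction, with illustrative parameter values.

```python
# Explicit finite-difference step for dT/dt = a * d2T/dx2 on a 1-D rod,
# with boundary nodes held fixed (Dirichlet conditions).
def step(T, a, dx, dt):
    r = a * dt / dx**2  # stability parameter of the explicit scheme
    assert r <= 0.5, "explicit scheme is unstable for r > 1/2"
    return ([T[0]] +
            [T[i] + r * (T[i-1] - 2*T[i] + T[i+1])
             for i in range(1, len(T) - 1)] +
            [T[-1]])

# Plate initially at 300 K with both edges held at 350 K.
T = [350.0] + [300.0] * 9 + [350.0]
for _ in range(2000):
    T = step(T, a=1e-5, dx=1e-3, dt=0.04)
# r = 0.4, so the scheme is stable and the profile relaxes toward 350 K.
```

In the three-dimensional stack model the same idea applies per node, but the update couples neighbors in all three directions and adds the electrochemical heat-source terms.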
Development of a computational aero/fluids analysis system
The Computational Aero/Fluids Analysis System (AFAS) provides the analytical capability to perform state-of-the-art computational analyses in two difficult fluid dynamics disciplines associated with the Space Shuttle program. The system provides the analysis tools and techniques for rapidly and efficiently accessing, analyzing, and reformulating the large and expanding external aerodynamic data base, while also providing tools for complex fluid flow analyses of the SSME engine components. Both of these fluid flow disciplines, external aerodynamics and internal gas dynamics, require this capability to ensure that MSFC can respond in a timely manner as problems are encountered and operational changes are made in the Space Shuttle program.
Analyzing Repeated Measures Marginal Models on Sample Surveys with Resampling Methods
Packaged statistical software for analyzing categorical, repeated-measures marginal models on sample survey data with binary covariates does not appear to be available. Consequently, this report describes a customized SAS program that accomplishes such an analysis on survey data with jackknifed replicate weights for which the primary sampling unit information has been suppressed for respondent confidentiality. First, the program employs the Macro Language and the Output Delivery System (ODS) to estimate the means and covariances of indicator variables for the response variables, taking the design into account. Then, it uses PROC CATMOD and ODS, ignoring the survey design, to obtain the design matrix and hypothesis test specifications. Finally, it enters these results into another run of CATMOD, which performs automated direct input of the survey design specifications and accomplishes the appropriate analysis. This customized SAS program can be employed, with minor editing, to analyze general categorical, repeated-measures marginal models on sample surveys with replicate weights. The results of our analysis accounting for the survey design are then compared to the results of two alternate analyses of the same data. This comparison confirms that such alternate analyses, which do not properly account for the design, do not produce useful results.
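The replicate-weight machinery underlying the SAS program can be sketched in a few lines. The example below shows the jackknife idea in general terms (it is not the report's SAS code): an estimate is recomputed under each set of replicate weights, and the spread of the replicate estimates measures the design variance. The JK1 coefficient (R-1)/R used here is one common convention; real surveys ship their own replicate factors.

```python
# Jackknife variance estimation from replicate weights.
def weighted_mean(y, w):
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

def jackknife_variance(y, full_w, replicate_ws):
    """JK1 variance of the weighted mean: (R-1)/R * sum of squared
    deviations of the replicate estimates from the full-sample estimate."""
    theta = weighted_mean(y, full_w)
    R = len(replicate_ws)
    return (R - 1) / R * sum(
        (weighted_mean(y, w) - theta) ** 2 for w in replicate_ws)

# Toy data: 4 respondents with delete-one-style replicate weights
# (the dropped unit gets weight 0, the rest are scaled up by 4/3).
y = [1.0, 0.0, 1.0, 1.0]
full_w = [1.0, 1.0, 1.0, 1.0]
reps = [[0.0, 4/3, 4/3, 4/3],
        [4/3, 0.0, 4/3, 4/3],
        [4/3, 4/3, 0.0, 4/3],
        [4/3, 4/3, 4/3, 0.0]]
se = jackknife_variance(y, full_w, reps) ** 0.5  # se == 0.25
```

Because only the weight sets, not the primary sampling units, are needed, this works even when PSU identifiers have been suppressed for confidentiality, which is exactly the situation the report addresses.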
VARTOOLS: A Program for Analyzing Astronomical Time-Series Data
This paper describes the VARTOOLS program, which is an open-source
command-line utility, written in C, for analyzing astronomical time-series
data, especially light curves. The program provides a general-purpose set of
tools for processing light curves including signal identification, filtering,
light curve manipulation, time conversions, and modeling and simulating light
curves. Some of the routines implemented include the Generalized Lomb-Scargle
periodogram, the Box-Least Squares transit search routine, the Analysis of
Variance periodogram, the Discrete Fourier Transform including the CLEAN
algorithm, the Weighted Wavelet Z-Transform, light curve arithmetic, linear and
non-linear optimization of analytic functions including support for Markov
Chain Monte Carlo analyses with non-trivial covariances, characterizing and/or
simulating time-correlated noise, and the TFA and SYSREM filtering algorithms,
among others. A mechanism is also provided for incorporating a user's own
compiled processing routines into the program. VARTOOLS is designed especially
for batch processing of light curves, including built-in support for parallel
processing, making it useful for large time-domain surveys such as searches for
transiting planets. Several examples are provided to illustrate the use of the
program.

Comment: 83 pages, 5 figures, accepted for publication in Astronomy and
Computing; code available at
http://www.astro.princeton.edu/~jhartman/vartools.htm
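To make the period-search routines concrete, here is a minimal pure-Python sketch of the classic Lomb-Scargle periodogram, one of the algorithms VARTOOLS implements (VARTOOLS itself is a compiled C program and uses the generalized, floating-mean variant; the data here are synthetic).

```python
import math
import random

def lomb_scargle(t, y, freqs):
    """Classic Lomb-Scargle power at each trial frequency, for
    unevenly sampled times t and mean-subtracted data."""
    ybar = sum(y) / len(y)
    d = [yi - ybar for yi in y]
    powers = []
    for f in freqs:
        w = 2 * math.pi * f
        # Time offset tau makes the sine and cosine terms orthogonal.
        tau = math.atan2(sum(math.sin(2*w*ti) for ti in t),
                         sum(math.cos(2*w*ti) for ti in t)) / (2*w)
        c = [math.cos(w*(ti - tau)) for ti in t]
        s = [math.sin(w*(ti - tau)) for ti in t]
        powers.append(0.5 * (
            sum(di*ci for di, ci in zip(d, c))**2 / sum(ci*ci for ci in c) +
            sum(di*si for di, si in zip(d, s))**2 / sum(si*si for si in s)))
    return powers

# Unevenly sampled noiseless sinusoid at 0.8 cycles/day over 30 days:
# the periodogram should peak near f = 0.8.
random.seed(1)
t = sorted(random.uniform(0, 30) for _ in range(200))
y = [math.sin(2*math.pi*0.8*ti) for ti in t]
freqs = [0.01 * k for k in range(1, 200)]
p = lomb_scargle(t, y, freqs)
best = freqs[p.index(max(p))]  # close to 0.8
```

Irregular sampling is the key motivation: a plain FFT assumes evenly spaced data, whereas astronomical light curves rarely are, which is why tools like VARTOOLS build on periodogram methods of this family.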
Analyzing social experiments as implemented: evidence from the HighScope Perry Preschool Program
Social experiments are powerful sources of information about the effectiveness of interventions. In practice, initial randomization plans are almost always compromised. Multiple hypotheses are frequently tested. "Significant" effects are often reported with p-values that do not account for preliminary screening from a large candidate pool of possible effects. This paper develops tools for analyzing data from experiments as they are actually implemented. We apply these tools to analyze the influential HighScope Perry Preschool Program. The Perry program was a social experiment that provided preschool education and home visits to disadvantaged children during their preschool years. It was evaluated by the method of random assignment. Both treatment and control participants have been followed from age 3 through age 40. Previous analyses of the Perry data assume that the planned randomization protocol was implemented. In fact, as in many social experiments, the intended randomization protocol was compromised. Accounting for compromised randomization, multiple-hypothesis testing, and small sample sizes, we find statistically significant and economically important program effects for both males and females. We also examine the representativeness of the Perry study.
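Small-sample inference of the kind this reanalysis requires can be done exactly rather than via large-sample approximations. The sketch below shows a standard exact permutation test for a treatment effect; the data and group sizes are illustrative, not the Perry data, and the paper's actual procedure additionally handles compromised randomization and multiple-hypothesis screening.

```python
# Exact one-sided permutation p-value for a difference in means, under
# the null that treatment labels are exchangeable across the pooled sample.
from itertools import combinations

def permutation_pvalue(treated, control):
    pooled = treated + control
    n, k = len(pooled), len(treated)
    observed = sum(treated)/k - sum(control)/len(control)
    count = total = 0
    for idx in combinations(range(n), k):
        idx_set = set(idx)
        t = [pooled[i] for i in idx]
        c = [pooled[i] for i in range(n) if i not in idx_set]
        stat = sum(t)/k - sum(c)/len(c)
        count += stat >= observed - 1e-12  # count relabelings at least as extreme
        total += 1
    return count / total

# Illustrative outcomes for 4 treated and 4 control participants.
p = permutation_pvalue([9, 8, 7, 8], [5, 6, 4, 7])  # p == 2/70
```

Because every relabeling is enumerated, the p-value is exact for any sample size, which matters when groups are as small as in early-childhood experiments; adjusting such p-values across many tested outcomes is then a separate multiple-testing step.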