Atomic power - its significance to the management of a relief valve company
Thesis (M.B.A.)--Boston University.
Semi-annual report of the Department of Energy, Office of Environmental Management, Quality Assessment Program
This Quality Assessment Program (QAP) is designed to test the quality of the environmental measurements being reported to the Department of Energy by its contractors. Since 1976, real or synthetic environmental samples that have been prepared and thoroughly analyzed at the Environmental Measurements Laboratory (EML) have been distributed, at first quarterly and then semi-annually, to these contractors. Their results, which are returned to EML within 90 days, are compiled with EML's results and are reported back to the participating contractors 30 days later. This report presents the results from the analysis of the 48th set of environmental quality assessment samples (QAP XLVIII) that were received on or before June 1, 1998.
A Comprehensive Economic Stimulus for our Failing Economy
This paper presents a comprehensive plan to fix the ailing American economy through a five-step approach. First, the Federal Reserve must continue to broaden the scope of monetary policy by purchasing and selling long-term securities; manipulating expectations through FOMC statements is another tool at the Federal Reserve's disposal. Second, the government must enact fiscal stimulus to stabilize the economy in the short and medium runs, through investment in infrastructure projects, green technology, fusion technology, and science education. Additionally, the new fiscal policy must tackle the mortgage meltdown, which is weighing down the entire economy. Third, the regulatory system must be changed to reduce the likelihood of another financial collapse, starting with the nationalization of the ratings agencies; ratings should be updated faster, with a numeric grading system rather than the existing letter grades. Fourth, our globalized economy ensures that a coordinated global response is necessary for recovery, so global cooperation to reduce inflation and avoid protectionist policies is vital. Finally, the American bailout policy must be made clear, giving bailouts only to companies that are sound but financially strapped and those that are too big to fail.
Theoretically Efficient Parallel Graph Algorithms Can Be Fast and Scalable
There has been significant recent interest in parallel graph processing due to the need to quickly analyze the large graphs available today. Many graph codes have been designed for distributed memory or external memory. However, today even the largest publicly-available real-world graph (the Hyperlink Web graph with over 3.5 billion vertices and 128 billion edges) can fit in the memory of a single commodity multicore server. Nevertheless, most experimental work in the literature reports results on much smaller graphs, and the work on the Hyperlink graph uses distributed or external memory. It is therefore natural to ask whether we can efficiently solve a broad class of graph problems on this graph in memory.
This paper shows that theoretically-efficient parallel graph algorithms can scale to the largest publicly-available graphs using a single machine with a terabyte of RAM, processing them in minutes. We give implementations of theoretically-efficient parallel algorithms for 20 important graph problems. We also present the optimizations and techniques that we used in our implementations, which were crucial in enabling us to process these large graphs quickly. We show that the running times of our implementations outperform existing state-of-the-art implementations on the largest real-world graphs. For many of the problems that we consider, this is the first time they have been solved on graphs at this scale. We have made the implementations developed in this work publicly available as the Graph-Based Benchmark Suite (GBBS).
Comment: This is the full version of the paper appearing in the ACM Symposium on Parallelism in Algorithms and Architectures (SPAA), 2018.
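For a sense of the style of algorithm involved, here is a minimal sketch of frontier-based breadth-first search, a pattern common to shared-memory graph frameworks; this is an illustrative Python sketch, not GBBS code, and a parallel implementation would replace the membership check with an atomic compare-and-swap.

    def bfs_levels(adj, source):
        # adj maps each vertex to a list of neighbors.
        level = {source: 0}
        frontier = [source]
        depth = 0
        while frontier:
            depth += 1
            next_frontier = []
            for u in frontier:                # parallel across the frontier
                for v in adj.get(u, []):      # parallel across each vertex's edges
                    if v not in level:        # real code: atomic compare-and-swap
                        level[v] = depth
                        next_frontier.append(v)
            frontier = next_frontier
        return level

    # Tiny example: a path 0-1-2 plus an edge 0-3.
    print(bfs_levels({0: [1, 3], 1: [0, 2], 2: [1], 3: [0]}, 0))
    # {0: 0, 1: 1, 3: 1, 2: 2}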
Multi-static, multi-frequency scattering from zooplankton
Abstract: Inversion of multi-frequency acoustic backscattering can be used to estimate size-abundances of zooplankton, given a valid model for backscattering by the zooplankters. The physical properties of the scatterers, density and compressibility (or compressional-wave sound speed), are usually assigned fixed values in the scattering model. These properties would be of interest if they could be measured in situ, e.g. to examine changes in lipid contents over seasons. Extension of currently-favored backscattering models to multi-static configurations looks promising as a method to directly measure these relevant physical properties simultaneously with size-abundance estimation.
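To illustrate the inversion step, here is a hedged sketch: assuming a linear forward model in which the measured backscatter at each frequency is a kernel-weighted sum over size classes, abundances can be recovered by nonnegative least squares. The kernel and abundance values below are invented for illustration; a real study would compute the kernel from a physical scattering model.

    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical kernel: K[i, j] is the modeled backscattering cross-section
    # of size class j at frequency i (values invented for illustration).
    K = np.array([[0.8, 0.3, 0.1],
                  [0.4, 0.9, 0.3],
                  [0.1, 0.5, 1.0]])
    true_n = np.array([2.0, 1.0, 0.5])   # invented "true" abundances
    b = K @ true_n                       # synthetic multi-frequency measurements

    n_est, resid = nnls(K, b)            # nonnegative least-squares inversion
    print(n_est)                         # recovers [2.0, 1.0, 0.5]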
The Computational Complexity of the Lorentz Lattice Gas
The Lorentz lattice gas is studied from the perspective of computational complexity theory. It is shown that, using massive parallelism, particle trajectories can be simulated in a time that scales logarithmically in the length of the trajectory. This result characterizes the "logical depth" of the Lorentz lattice gas and allows us to compare it to other models in statistical physics.
Comment: 9 pages, LaTeX, to appear in J. Stat. Phys.
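One standard route to such logarithmic parallel time bounds is repeated squaring of the one-step evolution map; whether this is the paper's exact technique is an assumption here, but the sketch below shows how a t-fold composition collapses to O(log t) rounds, each of which parallelizes across states.

    def compose(g, f):
        # (g after f): each entry is independent of the others, so a
        # parallel machine can compute all of them in a single step.
        return {s: g[f[s]] for s in f}

    def iterate(f, t):
        # Build f^t by repeated squaring: O(log t) compositions.
        result = {s: s for s in f}       # identity map
        power = dict(f)
        while t:
            if t & 1:
                result = compose(power, result)
            power = compose(power, power)
            t >>= 1
        return result

    # Example: a 4-state cycle advanced 10**9 steps in ~30 compositions.
    f = {0: 1, 1: 2, 2: 3, 3: 0}
    print(iterate(f, 10**9)[0])          # 10**9 % 4 == 0, so state 0 returns to 0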
Time-Lock Puzzles from Randomized Encodings
Time-lock puzzles are a mechanism for sending messages "to the future". A sender can quickly generate a puzzle with a solution s that remains hidden until a moderately large amount of time t has elapsed. The solution s should be hidden from any adversary that runs in time significantly less than t, including resourceful parallel adversaries with polynomially many processors.
While the notion of time-lock puzzles has been around for 22 years, there has only been a single candidate proposed. Fifteen years ago, Rivest, Shamir and Wagner suggested a beautiful candidate time-lock puzzle based on the assumption that exponentiation modulo an RSA integer is an "inherently sequential" computation.
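The Rivest-Shamir-Wagner construction can be sketched concretely: the generator uses the factorization of the modulus to shortcut the exponent, while a solver without it appears forced into t sequential squarings. Below is a toy Python sketch with deliberately small parameters; a real puzzle would use a large random RSA modulus whose factorization stays secret.

    import secrets

    p, q = 999983, 1000003               # small known primes, for illustration only
    n = p * q
    phi = (p - 1) * (q - 1)
    t = 100_000                          # required number of sequential squarings

    a = secrets.randbelow(n - 2) + 2     # assume gcd(a, n) == 1 (near-certain here)

    # Generation is fast: knowing phi(n), reduce the exponent 2^t first.
    solution = pow(a, pow(2, t, phi), n)

    # Solving without the factorization: t squarings, conjectured to be
    # inherently sequential (no parallel shortcut is known).
    x = a
    for _ in range(t):
        x = x * x % n
    assert x == solution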
We show that various flavors of randomized encodings give rise to time-lock puzzles of varying strengths, whose security can be shown assuming the mere existence of non-parallelizing languages: languages that require circuits of depth at least t to decide in the worst case. The existence of such languages is necessary for the existence of time-lock puzzles.
We instantiate the construction with different randomized encodings from the literature, where increasingly better efficiency is obtained based on increasingly stronger cryptographic assumptions, ranging from one-way functions to indistinguishability obfuscation. We also observe that time-lock puzzles imply one-way functions, and thus the reliance on some cryptographic assumption is necessary.
Finally, generalizing the above, we construct other types of puzzles, such as proofs of work, from randomized encodings and a suitable worst-case hardness assumption (which is necessary for such puzzles to exist).
Aliskiren, enalapril, or aliskiren and enalapril in heart failure
BACKGROUND
Among patients with chronic heart failure, angiotensin-converting–enzyme (ACE) inhibitors reduce mortality and hospitalization, but the role of a renin inhibitor in such patients is unknown. We compared the ACE inhibitor enalapril with the renin inhibitor aliskiren (to test superiority or at least noninferiority) and with the combination of the two treatments (to test superiority) in patients with heart failure and a reduced ejection fraction.
METHODS
After a single-blind run-in period, we assigned patients, in a double-blind fashion, to one of three groups: 2336 patients were assigned to receive enalapril at a dose of 5 or 10 mg twice daily, 2340 to receive aliskiren at a dose of 300 mg once daily, and 2340 to receive both treatments (combination therapy). The primary composite outcome was death from cardiovascular causes or hospitalization for heart failure.
RESULTS
After a median follow-up of 36.6 months, the primary outcome occurred in 770 patients (32.9%) in the combination-therapy group and in 808 (34.6%) in the enalapril group (hazard ratio, 0.93; 95% confidence interval [CI], 0.85 to 1.03). The primary outcome occurred in 791 patients (33.8%) in the aliskiren group (hazard ratio vs. enalapril, 0.99; 95% CI, 0.90 to 1.10); the prespecified test for noninferiority was not met. There was a higher risk of hypotensive symptoms in the combination-therapy group than in the enalapril group (13.8% vs. 11.0%, P=0.005), as well as higher risks of an elevated serum creatinine level (4.1% vs. 2.7%, P=0.009) and an elevated potassium level (17.1% vs. 12.5%, P<0.001).
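As a quick arithmetic check, the reported event rates match the randomized group sizes given in METHODS, assuming each rate is computed over all patients assigned to that group:

    groups = {
        "combination therapy": (770, 2340),
        "enalapril":           (808, 2336),
        "aliskiren":           (791, 2340),
    }
    for name, (events, size) in groups.items():
        print(f"{name}: {events}/{size} = {100 * events / size:.1f}%")
    # combination therapy: 32.9%, enalapril: 34.6%, aliskiren: 33.8%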
CONCLUSIONS
In patients with chronic heart failure, the addition of aliskiren to enalapril led to more adverse events without an increase in benefit. Noninferiority was not shown for aliskiren as compared with enalapril.
The Computational Complexity of Generating Random Fractals
In this paper we examine a number of models that generate random fractals. The models are studied using the tools of computational complexity theory from the perspective of parallel computation. Diffusion-limited aggregation and several widely used algorithms for equilibrating the Ising model are shown to be highly sequential; it is unlikely they can be simulated efficiently in parallel. This is in contrast to Mandelbrot percolation, which can be simulated in constant parallel time. Our research helps shed light on the intrinsic complexity of these models relative to each other and to different growth processes that have recently been studied using complexity theory. In addition, the results may serve as a guide to simulation physics.
Comment: 28 pages, LaTeX, 8 Postscript figures available from [email protected]