A study of the temporal changes recorded by ERTS and their geological significance
The temporal changes that are recorded by ERTS were evaluated for an area around Bathurst Inlet in the North West Territories. The seasons represented by the images included: early winter, spring, early summer, summer, and fall. Numerous surface characteristics (vegetation, drainage patterns, surface texture, lineament systems and topographic relief, etc.) were used to relate the change in observable features with the different seasons. It was found that the time of year when an observation is made has a strong control over the amount and type of information that can be derived by an experienced interpreter. It was concluded that a detailed study of temporal changes is an important part of any ERTS interpretation for geology
The Grow-Shrink strategy for learning Markov network structures constrained by context-specific independences
Markov networks are models for compactly representing complex probability
distributions. They are composed by a structure and a set of numerical weights.
The structure qualitatively describes independences in the distribution, which
can be exploited to factorize the distribution into a set of compact functions.
A key application for learning structures from data is to automatically
discover knowledge. In practice, structure learning algorithms focused on
"knowledge discovery" present a limitation: they use a coarse-grained
representation of the structure. As a result, this representation cannot
describe context-specific independences. Very recently, an algorithm called
CSPC was designed to overcome this limitation, but it has a high computational
complexity. This work tries to mitigate this downside presenting CSGS, an
algorithm that uses the Grow-Shrink strategy for reducing unnecessary
computations. On an empirical evaluation, the structures learned by CSGS
achieve competitive accuracies and lower computational complexity with respect
to those obtained by CSPC.
Comment: 12 pages and 8 figures. This work was presented at IBERAMIA 201
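As a point of reference for the strategy named in the title, below is a minimal Python sketch of the classic Grow-Shrink routine for recovering a variable's Markov blanket; the conditional-independence test ci_test is a placeholder for a statistical test on data, and none of CSGS's context-specific machinery is shown.

def grow_shrink_blanket(target, variables, ci_test):
    """Estimate the Markov blanket of `target`.

    ci_test(x, y, conditioning_set) -> True if x is judged independent of y
    given the conditioning set, according to the data.
    """
    blanket = set()

    # Grow phase: add any variable still dependent on the target
    # given the current blanket; repeat until nothing changes.
    changed = True
    while changed:
        changed = False
        for v in variables:
            if v != target and v not in blanket and not ci_test(target, v, blanket):
                blanket.add(v)
                changed = True

    # Shrink phase: remove false positives that became independent
    # of the target given the rest of the blanket.
    for v in list(blanket):
        if ci_test(target, v, blanket - {v}):
            blanket.discard(v)

    return blanket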
Landau-Pomeranchuk-Migdal resummation for dilepton production
We consider the thermal emission rate of dileptons from a QCD plasma in the
small invariant mass (Q^2 \sim g_s^2 T^2) but large energy (q^0 \gtrsim T)
range. We derive an integral equation which resums multiple scatterings to
include the LPM effect; it is valid at leading order in the coupling. Then we
recast it as a differential equation and show a simple algorithm for its
solution. We present results for dilepton rates at phenomenologically
interesting energies and invariant masses.
Comment: 19 pages, 7 postscript figures, test program available at
http://www-spht.cea.fr/articles/T02/150/libLPM
5D Black Rings and 4D Black Holes
It has recently been shown that the M-theory lift of a IIA 4D BPS Calabi-Yau
black hole is a 5D BPS black hole spinning at the center of a Taub-NUT-flux
geometry, and a certain linear relation between 4D and 5D BPS partition
functions was accordingly proposed. In the present work we fortify and enrich
this proposal by showing that the M-theory lifts of general 4D multi-black-hole
geometries are 5D black rings in a Taub-NUT-flux geometry.
Comment: 8 pages; version 2, with additional references and explanation
Parallel Metric Tree Embedding based on an Algebraic View on Moore-Bellman-Ford
A \emph{metric tree embedding} of expected \emph{stretch} $\alpha \geq 1$
maps a weighted $n$-node graph $G = (V, E, \omega)$ to a weighted tree
$T = (V_T, E_T, \omega_T)$ with $V \subseteq V_T$ such that, for all $v, w \in V$,
$\operatorname{dist}(v, w, G) \leq \operatorname{dist}(v, w, T)$ and
$\operatorname{E}[\operatorname{dist}(v, w, T)] \leq \alpha \operatorname{dist}(v, w, G)$.
Such embeddings are highly useful for designing
fast approximation algorithms, as many hard problems are easy to solve on tree
instances. However, to date the best parallel $(\operatorname{polylog} n)$-depth algorithm that achieves an asymptotically optimal expected stretch of
$\alpha \in O(\log n)$ requires
$\Omega(n^2)$ work and a metric as input.
In this paper, we show how to achieve the same guarantees using
$\operatorname{polylog} n$ depth and
$\tilde{O}(m^{1+\epsilon})$ work, where $m = |E|$ and $\epsilon > 0$ is an arbitrarily small constant.
Moreover, one may further reduce the work to $\tilde{O}(m + n^{1+\epsilon})$ at the expense of increasing the expected stretch to
$O(\epsilon^{-1} \log n)$.
Our main tool in deriving these parallel algorithms is an algebraic
characterization of a generalization of the classic Moore-Bellman-Ford
algorithm. We consider this framework, which subsumes a variety of previous
"Moore-Bellman-Ford-like" algorithms, to be of independent interest and discuss
it in depth. In our tree embedding algorithm, we leverage it to provide
efficient query access to an approximate metric that allows sampling the tree
using $\operatorname{polylog} n$ depth and $\tilde{O}(m^{1+\epsilon})$ work.
We illustrate the generality and versatility of our techniques by various
examples and a number of additional results
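For readers unfamiliar with the "algebraic view" referred to above, the sketch below illustrates the textbook observation it generalizes: a Moore-Bellman-Ford relaxation round is a matrix-vector product over the min-plus semiring. This is only the baseline idea, not the paper's generalized framework; the names and the small example graph are illustrative.

# One MBF round as a min-plus product: min plays the role of addition,
# ordinary addition plays the role of multiplication.
INF = float("inf")

def min_plus_product(weights, dist):
    """new_dist[v] = min over u of (dist[u] + weights[u][v])."""
    n = len(dist)
    return [min(min(dist[u] + weights[u][v] for u in range(n)), dist[v])
            for v in range(n)]

def bellman_ford(weights, source):
    n = len(weights)
    dist = [INF] * n
    dist[source] = 0
    for _ in range(n - 1):          # n-1 rounds suffice without negative cycles
        dist = min_plus_product(weights, dist)
    return dist

# 4-node example; weights[u][v] = INF means "no edge from u to v".
W = [[0, 2, INF, 7],
     [INF, 0, 3, INF],
     [INF, INF, 0, 1],
     [INF, INF, INF, 0]]
print(bellman_ford(W, 0))  # [0, 2, 5, 6]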
Fatigue testing a plurality of test specimens and method
Described is a fatigue testing apparatus for simultaneously subjecting a plurality of material test specimens to cyclical tension loading to determine the fatigue strength of the material. The fatigue testing apparatus includes a pulling head having cylinders defined therein which carry reciprocating pistons. The reciprocation of the pistons is determined by cyclical supplies of pressurized fluid to the cylinders. Piston rods extend from the pistons through the pulling head and are attachable to one end of the test specimens, the other end of the test specimens being attachable to a fixed base, causing test specimens attached between the piston rods and the base to be subjected to cyclical tension loading. Because all the cylinders share a common pressurized fluid supply, the breaking of a test specimen does not substantially affect the pressure of the fluid supplied to the other cylinders nor the tension applied to the other test specimens
Effect of aerobic capacity on Lower Body Negative Pressure (LBNP) tolerance in females
This investigation determined whether a relationship exists in females between: (1) aerobic capacity and Lower Body Negative Pressure (LBNP) tolerance; and (2) aerobic capacity and change in LBNP tolerance induced by bed rest. Nine females, age 27-47 (34.6 plus or minus 6.0 (Mean plus or minus SD)), completed a treadmill-graded exercise test to establish aerobic capacity. A presyncopal-limited LBNP test was performed prior to and after 13 days of bed rest at a 6 deg head-down tilt. LBNP tolerance was quantified as: (1) the absolute level of negative pressure (NP) tolerated for greater than or equal to 60 sec; and (2) Luft's Cumulative Stress Index (CSI). Aerobic capacity was 33.3 plus or minus 5.0 mL/kg/min and ranged from 25.7 to 38.7. Bed rest was associated with a decrease in NP tolerance (-9.04 plus or minus 1.6 kPa (-67.8 plus or minus 12.0 mmHg) versus -7.7 plus or minus 1.1 kPa (-57.8 plus or minus 8.33 mmHg); p = 0.028) and in CSI (99.4 plus or minus 27.4 kPa min (745.7 plus or minus 205.4 mmHg min) versus 77.0 plus or minus 16.9 kPa min (577.3 plus or minus mmHg min); p = 0.008). The correlation between aerobic capacity and absolute NP or CSI pre-bed rest did not differ significantly from zero (r = -0.56, p = 0.11 for NP; and r = -0.52, p = 0.16 for CSI). Also, no significant correlation was observed between aerobic capacity and pre- to post-rest change for absolute NP tolerance (r = -0.35, p = 0.35) or CSI (r = -0.32, p = 0.40). Therefore, a significant relationship does not exist between aerobic capacity and orthostatic function or change in orthostatic function induced by bed rest
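As a quick arithmetic check of the dual-unit values quoted above, the snippet below converts the reported kPa figures to mmHg using 1 kPa ≈ 7.50062 mmHg; the variable names and rounding are illustrative only.

KPA_TO_MMHG = 7.50062   # 1 kPa expressed in mmHg

# Reported values in kPa (pressures) and kPa*min (cumulative stress index).
reported = [
    ("NP tolerance pre-bed rest, kPa", -9.04),
    ("NP tolerance post-bed rest, kPa", -7.7),
    ("CSI pre-bed rest, kPa*min", 99.4),
    ("CSI post-bed rest, kPa*min", 77.0),
]

for label, value in reported:
    print(f"{label}: {value} -> {value * KPA_TO_MMHG:.1f} in mmHg units")
# Gives -67.8, -57.8, 745.6, 577.5, which agree with the mmHg figures
# quoted in the abstract to within rounding of the kPa values.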
Generating Explanatory Captions for Information Graphics
Graphical presentations can be used to communicate information in relational data sets succinctly and effectively. However, novel graphical presentations about numerous attributes and their relationships are often difficult to understand completely until explained. Automatically generated graphical presentations must therefore either be limited to simple, conventional ones, or risk incomprehensibility. One way of alleviating this problem is to design graphical presentation systems that can work in conjunction with a natural language generator to produce "explanatory captions." This paper presents three strategies for generating explanatory captions to accompany information graphics based on: (1) a representation of the structure of the graphical presentation, (2) a framework for identifying the perceptual complexity of graphical elements, and (3) the structure of the data expressed in the graphic. We describe an implemented system and illustrate how it is used to generate explanatory captions...
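Purely as a hypothetical illustration of strategy (2), a caption generator might rank graphical elements by an assumed perceptual-complexity score and explain only the hard ones; the data types, scores, and threshold below are invented for the example and are not taken from the paper.

from dataclasses import dataclass

@dataclass
class GraphicElement:
    name: str          # e.g. "stacked bars", "second y-axis"
    complexity: int    # assumed perceptual-complexity rating, 1 (easy) .. 5 (hard)

def caption_candidates(elements, threshold=3):
    """Return the elements complex enough that a caption should explain them."""
    return [e.name for e in elements if e.complexity >= threshold]

chart = [GraphicElement("bar height encodes revenue", 1),
         GraphicElement("color encodes region and year jointly", 4),
         GraphicElement("right-hand axis uses a log scale", 5)]
print(caption_candidates(chart))
# ['color encodes region and year jointly', 'right-hand axis uses a log scale']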
Symbolic Algorithms for Language Equivalence and Kleene Algebra with Tests
We first propose algorithms for checking language equivalence of finite
automata over a large alphabet. We use symbolic automata, where the transition
function is compactly represented using (multi-terminal) binary decision
diagrams (BDDs). The key idea consists in computing a bisimulation by exploring
reachable pairs symbolically, so as to avoid redundancies. This idea can be
combined with already existing optimisations, and we show in particular a nice
integration with the disjoint sets forest data-structure from Hopcroft and
Karp's standard algorithm. Then we consider Kleene algebra with tests (KAT), an
algebraic theory that can be used for verification in various domains ranging
from compiler optimisation to network programming analysis. This theory is
decidable by reduction to language equivalence of automata on guarded strings,
a particular kind of automata that have exponentially large alphabets. We
propose several methods for constructing symbolic automata out of KAT
expressions, based either on Brzozowski's derivatives or standard automata
constructions. All in all, this results in efficient algorithms for deciding
equivalence of KAT expressions
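To make the "reachable pairs plus disjoint-set forest" idea concrete, here is a small explicit-alphabet sketch of a Hopcroft-Karp-style equivalence check; the symbolic, BDD-based handling of large alphabets that the paper actually proposes is not attempted here, and the function and state names are illustrative.

def equivalent(delta, accepting, alphabet, s1, s2):
    """delta: dict (state, letter) -> state; accepting: set of states."""
    parent = {}

    def find(x):                          # disjoint-set forest with path halving
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    todo = [(s1, s2)]
    while todo:
        p, q = todo.pop()
        rp, rq = find(p), find(q)
        if rp == rq:
            continue                      # pair already known equivalent, skip
        if (p in accepting) != (q in accepting):
            return False                  # counterexample: one accepts, the other rejects
        parent[rp] = rq                   # merge the two equivalence classes
        for a in alphabet:
            todo.append((delta[(p, a)], delta[(q, a)]))
    return True

# Two DFAs over {0, 1} accepting strings with an even number of 1s,
# encoded in one transition table with disjoint state names.
delta = {("A", "0"): "A", ("A", "1"): "B", ("B", "0"): "B", ("B", "1"): "A",
         ("X", "0"): "X", ("X", "1"): "Y", ("Y", "0"): "Y", ("Y", "1"): "X"}
print(equivalent(delta, {"A", "X"}, ["0", "1"], "A", "X"))  # True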
Nondestructive testing techniques used in analysis of honeycomb structure bond strength
DOT (Driver-Displacement Oriented Transducer), applicable to both lap-shear-type applications and honeycomb sandwich structures, measures the displacement of the honeycomb composite face sheet. It incorporates an electromagnetic driver and a displacement measuring system into a single unit to provide noncontact bond strength measurements