Characterization of the Benchmark Binary NLTT 33370
We report the confirmation of the binary nature of the nearby, very low-mass
system NLTT 33370 with adaptive optics imaging and present resolved
near-infrared photometry and integrated light optical and near-infrared
spectroscopy to characterize the system. VLT-NaCo and LBTI-LMIRCam images show
significant orbital motion between 2013 February and 2013 April. Optical
spectra reveal weak, gravity sensitive alkali lines and strong lithium 6708
Angstrom absorption that indicate the system is younger than field age.
VLT-SINFONI near-IR spectra also show weak, gravity sensitive features and
spectral morphology that is consistent with other young, very low-mass dwarfs.
We combine the constraints from all age diagnostics to estimate a system age of
~30-200 Myr. The 1.2-4.7 micron spectral energy distributions of the components
point toward T_eff=3200 +/- 500 K and T_eff=3100 +/- 500 K for NLTT 33370 A and
B, respectively. The observed spectra, derived temperatures, and estimated age
combine to constrain the component spectral types to the range M6-M8.
Evolutionary models predict masses of 113 +/- 8 M_Jup and 106 +/- 7 M_Jup from
the estimated luminosities of the components. KPNO-Phoenix spectra allow us to
estimate the systemic radial velocity of the binary. The Galactic kinematics of
NLTT 33370AB are broadly consistent with other young stars in the Solar
neighborhood. However, definitive membership in a young, kinematic group cannot
be assigned at this time and further follow-up observations are necessary to
fully constrain the system's kinematics. The proximity, age, and late spectral
type of this binary make it a rare and ideal target for rapid, complete
orbit determination. The system is one of only a few model calibration
benchmarks at young ages and very low masses. (25 pages, 3 tables, 13 figures;
accepted for publication in The Astrophysical Journal.)
Characterization methods dedicated to nanometer-thick hBN layers
Hexagonal boron nitride (hBN) has regained interest as a strategic component in
graphene engineering and in van der Waals heterostructures built with
two-dimensional materials. It is therefore crucial to have reliable
characterization techniques capable of assessing the structural and electronic
quality of the hBN material used. We present here characterization
procedures based on optical spectroscopies, namely cathodoluminescence and
Raman spectroscopy, with the additional support of structural analysis conducted
by transmission electron microscopy. We show the capability of optical
spectroscopies to investigate and benchmark the optical and structural
properties of hBN thin layers from various sources.
Has U.S. Inflation Really Become Harder to Forecast?
Recently Stock and Watson (2007) showed that since the mid-1980s it has been hard for backward-looking Phillips curve models to improve on simple univariate models in forecasting U.S. inflation. While this is indeed the case when the benchmark is a causal autoregression, little change in forecast accuracy is detected when a noncausal autoregression is taken as the benchmark. In this note, we argue that a noncausal autoregression provides a better characterization of U.S. inflation dynamics than the conventional causal autoregression and that it is, therefore, the appropriate univariate benchmark model.
Keywords: inflation forecast; noncausal time series; Phillips curve
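The causal/noncausal distinction at issue can be sketched as follows (notation ours; the abstract does not specify a parameterization):

```latex
% Causal AR(p): y_t depends on its own past
y_t = \phi_1 y_{t-1} + \cdots + \phi_p y_{t-p} + \varepsilon_t
% Purely noncausal AR(s): y_t depends on its own future
y_t = \varphi_1 y_{t+1} + \cdots + \varphi_s y_{t+s} + \varepsilon_t
```

A noncausal benchmark thus lets the process load on leads rather than lags alone, which is the sense in which it can characterize inflation dynamics differently from a conventional causal autoregression.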
ShenZhen transportation system (SZTS): a novel big data benchmark suite
Data analytics is at the core of the supply chain for both products and services in modern economies and societies. Big data workloads, however, are placing unprecedented demands on computing technologies, calling for a deep understanding and characterization of these emerging workloads. In this paper, we propose the ShenZhen Transportation System (SZTS), a novel big data Hadoop benchmark suite comprising real-life transportation analysis applications with real-life input data sets from Shenzhen, China. SZTS uniquely focuses on a specific, real-life application domain, whereas other existing Hadoop benchmark suites, such as HiBench and CloudRank-D, consist of generic algorithms with synthetic inputs. We perform a cross-layer workload characterization at the microarchitecture level, the operating system (OS) level, and the job level, revealing unique characteristics of SZTS compared to existing Hadoop benchmarks as well as the general-purpose multi-core PARSEC benchmarks. We also study the sensitivity of workload behavior with respect to input data size, and we propose a methodology for identifying representative input data sets.
Frontiers of the physics of dense plasmas and planetary interiors: experiments, theory, applications
Recent developments of dynamic x-ray characterization experiments of dense
matter are reviewed, with particular emphasis on conditions relevant to
interiors of terrestrial and gas giant planets. These studies include
characterization of compressed states of matter in light elements by x-ray
scattering and imaging of shocked iron by radiography. Several applications of
this work are examined. These include the structure of massive "Super Earth"
terrestrial planets around other stars, the 40 known extrasolar gas giants with
measured masses and radii, and Jupiter itself, which serves as the benchmark
for giant planets. (Accepted to the Physics of Plasmas special issue; review
from HEDP/HEDLA-08, April 12-15, 2008.)
Consumption Externalities and Capital Accumulation in an Overlapping Generations Economy
This paper extends the standard overlapping generations model of capital accumulation by introducing consumption externalities. It is assumed that each generation's felicity depends on the social level of benchmark consumption as well as on its own consumption. Since the benchmark consumption is represented by the average consumption of all agents, the contemporaneous consumption externalities are determined by both intragenerational and intergenerational interactions among the consumers. Given this setting, we show that even in a simple model with a logarithmic utility function, the presence of consumption externalities may significantly affect the dynamic behavior and steady-state characterization of the economy. We also reveal that the same conclusion holds in an endogenous growth model in which production externalities sustain continuing growth.
Keywords: overlapping generations; benchmark consumption; intergenerational externalities; intragenerational externalities
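As an illustration of the setup (a minimal sketch; the paper's exact specification may differ), a logarithmic felicity with an average-consumption benchmark could take the form:

```latex
u_t \;=\; \ln c_t \;+\; \gamma \ln \bar{c}_t ,
\qquad \bar{c}_t = \text{average consumption of all agents in period } t ,
```

where the sign and size of $\gamma$ encode whether the benchmark raises or lowers felicity, and the intra- and intergenerational interactions described in the abstract enter through how $\bar{c}_t$ aggregates consumption across cohorts.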
Optimal Competitive Auctions
We study the design of truthful auctions for selling identical items in
unlimited supply (e.g., digital goods) to n unit demand buyers. This classic
problem stands out from profit-maximizing auction design literature as it
requires no probabilistic assumptions on buyers' valuations and employs the
framework of competitive analysis. Our objective is to optimize the worst-case
performance of an auction, measured by the ratio between a given benchmark and
revenue generated by the auction.
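In symbols (notation ours, not the paper's): for a benchmark $B$ and a truthful auction $\mathcal{A}$ with revenue $R_{\mathcal{A}}$, the worst-case performance over valuation profiles $v$ is

```latex
\beta(\mathcal{A}) \;=\; \sup_{v} \; \frac{B(v)}{\mathbb{E}\!\left[ R_{\mathcal{A}}(v) \right]} ,
```

and an optimal competitive auction is one minimizing $\beta(\mathcal{A})$ over all truthful auctions.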
We establish a necessary and sufficient condition that characterizes
competitive ratios for all monotone benchmarks. The characterization identifies
the worst-case distribution of instances and reveals intrinsic relations
between competitive ratios and benchmarks in the competitive analysis. With the
characterization at hand, we show optimal competitive auctions for two natural
benchmarks.
The most well-studied benchmark measures the
envy-free optimal revenue where at least two buyers win. Goldberg et al. [13]
showed a sequence of lower bounds on the competitive ratio for each number of
buyers n. They conjectured that all these bounds are tight. We show that
optimal competitive auctions match these bounds. Thus, we confirm the
conjecture and settle a central open problem in the design of digital goods
auctions. As one more application, we examine another economically meaningful
benchmark, which measures the optimal revenue across all limited-supply Vickrey
auctions. We identify the optimal competitive ratios for this benchmark for
each number of buyers n, as well as their limiting value as n approaches
infinity.
An Extensible Benchmarking Infrastructure for Motion Planning Algorithms
Sampling-based planning algorithms are the most common probabilistically
complete algorithms and are widely used on many robot platforms. Within this
class of algorithms, many variants have been proposed over the last 20 years,
yet there is still no characterization of which algorithms are well-suited for
which classes of problems. This has motivated us to develop a benchmarking
infrastructure for motion planning algorithms. It consists of three main
components. First, we have created an extensive benchmarking software framework
that is included with the Open Motion Planning Library (OMPL), a C++ library
that contains implementations of many sampling-based algorithms. Second, we
have defined extensible formats for storing benchmark results. The formats are
fairly straightforward so that other planning libraries could easily produce
compatible output. Finally, we have created an interactive, versatile
visualization tool for compact presentation of collected benchmark data. The
tool and underlying database facilitate the analysis of performance across
benchmark problems and planners. (Submitted to IEEE Robotics & Automation
Magazine, Special Issue on Replicable and Measurable Robotics Research, 201)
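As a toy illustration of what an extensible, library-agnostic results store might look like (the schema, field names, and queries here are hypothetical, not OMPL's actual log or database format):

```python
import sqlite3

def store_runs(conn, runs):
    """Append benchmark runs (one row per planner execution on a problem)
    to a simple tabular store that any planning library could write to."""
    conn.execute("""CREATE TABLE IF NOT EXISTS runs (
        problem TEXT, planner TEXT, solved INTEGER, time REAL)""")
    conn.executemany(
        "INSERT INTO runs VALUES (:problem, :planner, :solved, :time)", runs)

def summarize(conn):
    """Success rate and mean solve time per planner, across all problems."""
    return {planner: (rate, avg) for planner, rate, avg in conn.execute(
        "SELECT planner, AVG(solved), AVG(time) FROM runs GROUP BY planner")}

conn = sqlite3.connect(":memory:")
store_runs(conn, [
    {"problem": "maze", "planner": "RRT", "solved": 1, "time": 0.42},
    {"problem": "maze", "planner": "PRM", "solved": 0, "time": 5.00},
    {"problem": "maze", "planner": "RRT", "solved": 1, "time": 0.38},
])
print(summarize(conn))
```

Keeping the on-disk representation this flat is what makes such a format easy for other libraries to produce and for a visualization tool to aggregate across planners and problems.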