Determining the dimension of iterative Hessian transformation
The central mean subspace (CMS) and iterative Hessian transformation (IHT)
have been introduced recently for dimension reduction when the conditional mean
is of interest. Suppose that X is a vector-valued predictor and Y is a scalar
response. The basic problem is to find a lower-dimensional predictor \eta^TX
such that E(Y|X)=E(Y|\eta^TX). The CMS defines the inferential object for this
problem and IHT provides an estimating procedure. Compared with other methods,
IHT requires fewer assumptions and has been shown to perform well when the
additional assumptions required by those methods fail. In this paper we give an
asymptotic analysis of IHT and provide stepwise asymptotic hypothesis tests to
determine the dimension of the CMS, as estimated by IHT. Here, the original IHT
method has been modified to be invariant under location and scale
transformations. To provide empirical support for our asymptotic results, we
will present a series of simulation studies. These agree well with the theory.
The method is applied to analyze an ozone data set.
Comment: Published at http://dx.doi.org/10.1214/009053604000000661 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
Detection of radioactive material entering national ports: A Bayesian approach to radiation portal data
Given the potential for illicit nuclear material to be used in terrorism,
most ports now inspect a large number of goods entering national borders for
radioactive cargo. The U.S. Department of Homeland Security is moving toward
one hundred percent inspection of all containers entering the U.S. at various
ports of entry for nuclear material. We propose a Bayesian classification
approach for the real-time data collected by the inline Polyvinyl Toluene
radiation portal monitors. We study the computational and asymptotic properties
of the proposed method and demonstrate its efficacy in simulations. Given data
available to the authorities, it should be feasible to implement this approach
in practice.
Comment: Published at http://dx.doi.org/10.1214/10-AOAS334 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
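As a rough illustration of the kind of Bayesian classification described above — note that the count rates, screening interval, and threat prior below are invented for the sketch, not calibrated values from the paper — one can score a container's gross gamma counts against Poisson models for benign and threat cargo:

```python
import math

# Hypothetical rates for one screening interval at a PVT portal monitor
# (illustrative numbers only, not the paper's calibrated values).
LAMBDA_BENIGN = 50.0   # expected counts, benign cargo plus background
LAMBDA_THREAT = 70.0   # expected counts with a radioactive source present
PRIOR_THREAT = 1e-3    # assumed prior probability of a threat container

def poisson_logpmf(k, lam):
    """Log of the Poisson pmf, computed stably via lgamma."""
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def posterior_threat(counts):
    """P(threat | observed counts) by Bayes' rule with Poisson likelihoods."""
    log_p1 = math.log(PRIOR_THREAT) + poisson_logpmf(counts, LAMBDA_THREAT)
    log_p0 = math.log(1 - PRIOR_THREAT) + poisson_logpmf(counts, LAMBDA_BENIGN)
    m = max(log_p0, log_p1)  # subtract the max before exponentiating
    return math.exp(log_p1 - m) / (math.exp(log_p0 - m) + math.exp(log_p1 - m))

# A near-background reading stays near the prior; a strongly elevated
# reading overwhelms the small prior and flags the container.
print(posterior_threat(52), posterior_threat(95))
```

A real-time implementation would extend this with the spatial count profile recorded as the container passes the monitor, which is the data structure the paper works with.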
Entanglement transformation with no classical communication
We present an optimal scheme to realize the transformations between single
copies of two bipartite entangled states without classical communication
between the sharing parties. The scheme achieves the upper bound for the
success probabilities [PRA 63, 022301 (2001), PRL 83, 1455 (1999)] of
generating maximally entangled states if applied to entanglement concentration.
Such a strategy also dispenses with interaction with an ancilla system in the
implementation. We also show that classical communication is indispensable in
realizing the deterministic transformations of a single bipartite entangled
state. With a finite number of identical pairs of two entangled bosons, on the
other hand, we can realize the deterministic transformation to any target
entangled state of equal or less Schmidt rank through an extension of the
scheme.
Comment: published version
A Poisson Regression Examination of the Relationship between Website Traffic and Search Engine Queries
A new area of research involves the use of Google search data, which Google normalizes and scales before release, to predict economic activity. This new source of data has both advantages and disadvantages, which are discussed using daily and weekly data. Daily and weekly data are employed to show the effect of aggregation as it pertains to Google data, which can lead to contradictory findings. In this paper, Poisson regressions are used to explore the relationship between the online traffic to a specific website and the search volumes for certain keyword queries, along with the rankings of that website for those queries. The purpose of this paper is to point out the benefits and the pitfalls of a potential new source of data that lacks transparency with regard to the original level data, owing to the normalization and scaling procedure used by Google.
Keywords: Poisson Regression, Search Engine, Google Insights, Aggregation, Normalization Effects, Scaling Effects
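A minimal sketch of the modelling approach described above — the data here are synthetic, and the predictors (a 0–100 search-volume index and a query ranking) are stand-ins for the paper's variables — fits a Poisson regression of daily site visits on search volume and ranking by Newton–Raphson:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily data: a search-volume index scaled 0-100 (as Google
# reports it) and the site's average ranking for the query.
n = 200
search_volume = rng.uniform(10, 100, n)
ranking = rng.integers(1, 11, n).astype(float)
X = np.column_stack([np.ones(n), search_volume, ranking])

# Invented "true" effects: visits rise with search volume, fall with
# a worse (numerically larger) ranking.
true_beta = np.array([1.0, 0.02, -0.1])
visits = rng.poisson(np.exp(X @ true_beta))

# Fit the Poisson GLM with log link by Newton-Raphson on the log-likelihood.
beta = np.array([np.log(visits.mean() + 1.0), 0.0, 0.0])
for _ in range(25):
    mu = np.exp(X @ beta)                 # fitted means
    grad = X.T @ (visits - mu)            # score vector
    hess = X.T @ (X * mu[:, None])        # Fisher information
    beta = beta + np.linalg.solve(hess, grad)

print(np.round(beta, 3))  # should be near true_beta
```

The same fit is available off the shelf (e.g. a Poisson GLM in statsmodels); the explicit iteration is shown only to keep the sketch self-contained.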
What if pulsars are born as strange stars?
The possibility and the implications of the idea that pulsars are born as
strange stars are explored. Strange stars are very likely to have atmospheres
with typical mass of but bare polar caps almost
throughout their lifetimes, if they are produced during supernova explosions. A
direct consequence of the bare polar cap is that the binding energies of both
positively and negatively charged particles at the bare quark surface are
nearly infinite, so that the vacuum polar gap sparking scenario as proposed by
Ruderman & Sutherland should operate above the cap, regardless of the sense of
the magnetic pole with respect to the rotational pole. Heat cannot accumulate
on the polar cap region due to the large thermal conductivity on the bare quark
surface. We test this ``bare polar cap strange star'' (BPCSS) idea with the
present broad band emission data of pulsars, and propose several possible
criteria to distinguish BPCSSs from neutron stars.
Comment: 31 pages in LaTeX. Accepted by Astroparticle Physics
Gapless formation in the condensed color-flavor locked quark matter : a model-independent treatment
The electric/color neutral solution and the critical conditions for gapless
formation are investigated in the condensed color-flavor-locked matter.
We point out that gapless modes no longer exist for down-strange quark
pairing, while the gapless phenomenon for up-strange pairing dominates in the
condensed environment. In a model-independent way, the phase transition
to the resulting gapless phase is found to be of first order. The novel phase
structure implies that the chromomagnetic instability arising in the
previously predicted gapless phase might be removed, at least partly.
Comment: 2 figures
Setting Fees in Competing Double Auction Marketplaces: An Equilibrium Analysis
In this paper, we analyse competing double auction marketplaces that vie for traders and need to set appropriate fees to make a profit. Specifically, we show how competing marketplaces should set their fees by analysing the equilibrium behaviour of two competing marketplaces. In doing so, we focus on two different types of market fees: registration fees, charged to traders when they enter a marketplace, and profit fees, charged to traders when they make transactions. In more detail, given the market fees, we first derive equations to calculate the marketplaces' expected profits. Then we analyse the equilibrium charging behaviour of the marketplaces in two different cases: where competing marketplaces can only charge the same type of fees, and where they can charge different types of fees. This analysis provides insights which can be used to guide the charging behaviour of competing marketplaces. We also analyse whether two marketplaces can co-exist in equilibrium. We find that, when both marketplaces are limited to charging the same type of fees, traders will eventually converge to one marketplace. However, when different types of fees are allowed, traders may converge to different marketplaces (i.e. multiple marketplaces can co-exist).
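To make the distinction between the two fee types concrete, the toy sketch below computes a single marketplace's expected profit under each; the trader-population model (exponentially distributed gains from trade, entry only when gain net of fees is positive, profit fee as a fraction of gain) is invented for illustration and is much simpler than the paper's equilibrium analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

def expected_profit(reg_fee, profit_fee, n_traders=1000):
    """Monte Carlo estimate of a marketplace's profit for given fees.

    Each trader draws a hypothetical expected gain from trade and enters
    only if that gain, net of both fees, is positive (an assumed entry
    rule, not the paper's equilibrium condition).
    """
    gains = rng.exponential(1.0, n_traders)        # surplus per trader
    enters = gains * (1 - profit_fee) - reg_fee > 0
    # Registration fees are collected on entry; profit fees on transactions.
    return reg_fee * enters.sum() + profit_fee * gains[enters].sum()

# Compare a pure registration-fee policy with a pure profit-fee policy.
print(expected_profit(0.5, 0.0), expected_profit(0.0, 0.2))
```

Even in this toy model the trade-off the paper studies is visible: registration fees deter low-surplus traders from entering at all, while profit fees admit everyone but tax each transaction.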