Nonparametric models of financial leverage decisions
This paper investigates the properties of nonparametric decision tree models in the analysis of financial leverage decisions. This approach has two appealing features: the relationship between leverage ratios and the explanatory variables is not predetermined but is derived from the information provided by the data, and the models respect the bounded and fractional nature of leverage ratios. The analysis shows that tree models suggest relationships between explanatory variables and the relative amount of issued debt that parametric models fail to capture. Furthermore, the significant relationships found by tree models are in most cases in accordance with the effects predicted by the pecking-order theory. The results also show that two-part tree models can better accommodate the distinct effects of explanatory variables on the decision to issue debt and on the amount of debt issued by firms that do resort to debt.
Keywords: Capital structure; fractional regression; decision trees; two-part models
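As background on the two-part structure mentioned above, the following is a minimal sketch (hypothetical data and variable names, not the authors' code or dataset) of a two-part tree model: a classification tree for the decision to issue debt and a regression tree for the fraction issued, conditional on issuing.

```python
# Minimal sketch of a two-part tree model for a fractional response
# (hypothetical data; not the authors' code or dataset).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 5_000
X = rng.normal(size=(n, 4))                                # firm-level covariates (hypothetical)
issue = (X[:, 0] + rng.normal(size=n) > 0).astype(int)     # decision to issue debt
# Fraction of debt issued, bounded in (0, 1], observed only for issuers
frac = np.clip(1 / (1 + np.exp(-(0.5 * X[:, 1] + rng.normal(size=n)))), 1e-3, 1)
frac = np.where(issue == 1, frac, 0.0)

# Part 1: probability of issuing any debt
part1 = DecisionTreeClassifier(max_depth=4, min_samples_leaf=100).fit(X, issue)
# Part 2: expected fraction issued, fitted on issuers only
mask = issue == 1
part2 = DecisionTreeRegressor(max_depth=4, min_samples_leaf=100).fit(X[mask], frac[mask])

# Unconditional prediction: P(issue | x) * E[fraction | issue, x]
pred = part1.predict_proba(X)[:, 1] * part2.predict(X)
print(pred[:5])
```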
An analysis of B_{d,s} mixing angles in the presence of New Physics and an update of Bs -> K0* anti-K0*
We discuss a simple approach for measuring the weak mixing angles phi_s and phi_d of the Bs and Bd systems in the presence of New Physics. We present a new expression that allows one to measure the New Physics mixing angles directly, provided that New Physics contributes significantly only to the mixing. We apply the method to specific penguin-mediated B->PP, B->PV and B->VV modes. We provide a very stringent and simple bound on the direct CP asymmetries of all these modes, the violation of which would signal New Physics in the decay. Within the same theoretical framework, an updated prediction for the branching ratio of Bs->K0* anti-K0* is presented, which can be compared with a recent LHCb analysis.
Comment: 11 pages, 3 figures
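For orientation, one standard parameterization of New Physics in neutral B mixing (common in the literature, and not necessarily the expression derived in this paper) writes the dispersive mixing amplitude as M_12^q = M_12^{SM,q} \Delta_q with \Delta_q = |\Delta_q| e^{i\phi_q^\Delta}, q = d, s, so that the measured mixing angle becomes \phi_q = \phi_q^{SM} + \phi_q^\Delta; the New Physics phase \phi_q^\Delta is what such an analysis aims to extract.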
Is neglected heterogeneity really an issue in binary and fractional regression models? A simulation exercise for logit, probit and loglog models
In this paper we examine, both theoretically and by simulation, whether unobserved heterogeneity independent of the included regressors is really an issue in logit, probit and loglog models with both binary and fractional data. We find that unobserved heterogeneity: (i) produces an attenuation bias in the estimation of regression coefficients; (ii) is innocuous for logit estimation of average sample partial effects, whereas in the probit and loglog cases there may be important biases in the estimation of those quantities; (iii) has much more destructive effects on the estimation of population partial effects; (iv) leaves the prediction of outcomes essentially unaffected only in logit models; and (v) is innocuous for the size and consistency of Wald tests for the significance of observed regressors but, in small samples, substantially reduces their power.
Keywords: Binary models; fractional models; neglected heterogeneity; partial effects; prediction; Wald tests
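A minimal simulation sketch of point (i) follows; the data-generating process, sample size and library calls are illustrative assumptions, not the paper's simulation design.

```python
# Probit with neglected heterogeneity: the slope coefficient is attenuated by
# 1 / sqrt(1 + sigma_u^2). Hypothetical DGP, not the paper's simulation design.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n, beta, sigma_u = 100_000, 1.0, 1.0
x = rng.normal(size=n)
u = rng.normal(scale=sigma_u, size=n)        # unobserved heterogeneity, independent of x
eps = rng.normal(size=n)
y = (beta * x + u + eps > 0).astype(int)     # latent-index binary outcome

X = sm.add_constant(x)
fit = sm.Probit(y, X).fit(disp=0)            # probit ignoring u
print("estimated slope:", fit.params[1])
print("theoretical attenuated slope:", beta / np.sqrt(1 + sigma_u**2))
```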
"Cellsense" : design of a whole cell biosensor for biomedical applications
Cancer represents a public health problem worldwide, with a growing incidence as a result of population aging but also of the adoption of a cancer-associated lifestyle [R1].
The major problem is that most cancer types are diagnosed at a late stage, reducing treatment effectiveness, so an early diagnosis would considerably improve clinical outcome [R2]. Cancer diagnosis based on molecular features can be highly specific and sensitive, but very few biomarkers are available. Since most methodologies rely on known biomarkers to develop molecular probes, the discovery of unknown molecular features of diseased cells is very difficult [R3].
SELEX (Systematic Evolution of Ligands by Exponential Enrichment) is a procedure for the in vitro selection, from a random pool of nucleic acid sequences, of oligonucleotides called aptamers. Owing to their complex 3D structure, aptamers bind specifically to targets within complex mixtures, with an affinity for the ligand comparable to that of monoclonal antibodies.
Cell-SELEX uses whole living cells as targets to select aptamers that specifically recognize them. Diseased cells usually present specific biomarkers, such as wild-type proteins that are specifically or differentially expressed or carry aberrant post-translational modifications, potentially resulting from genetic lesions. Aptamer probes selected against cancer cells can identify these molecular differences, discriminating not only between normal and tumor cells but also between cells at different disease stages or from different patients. Cell-SELEX has the advantage that aptamer selection can be done without prior knowledge of the target molecules [R3].
In this work, cancer cells and fibroblasts are being used to obtain a reduced pool of sequences that specifically recognize cancer cells. After this pool has been studied, its binding affinity and detection performance will be tested and optimized. The generated aptamers can then be combined with nanotechnology to obtain a multivalent nanovector for cancer diagnosis.
Asymptotic bias for GMM and GEL estimators with estimated nuisance parameters
This paper studies and compares the asymptotic bias of GMM and generalized empirical likelihood (GEL) estimators in the presence of estimated nuisance parameters. We consider cases in which the nuisance parameter is estimated from independent and identical samples. A simulation experiment is conducted for covariance structure models. Empirical likelihood offers much reduced mean and median bias, root mean squared error and mean absolute error compared with two-step GMM and other GEL methods. Both analytical and bootstrap bias-adjusted two-step GMM estimators are also compared. Analytical bias adjustment appears to be a serious competitor to bootstrap methods in terms of finite sample bias, root mean squared error and mean absolute error. Finite sample variance seems to be little affected.
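For readers unfamiliar with the estimator under comparison, here is a generic two-step GMM sketch for a toy over-identified moment model; it is illustrative only and unrelated to the covariance structure design used in the paper.

```python
# Two-step GMM for a toy over-identified model: estimate (mu, sigma2) from
# three moment conditions that hold for a symmetric distribution. Illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=1.5, size=2_000)

def moments(theta, x):
    mu, sigma2 = theta
    return np.column_stack([
        x - mu,                   # E[x - mu] = 0
        (x - mu) ** 2 - sigma2,   # E[(x - mu)^2 - sigma2] = 0
        (x - mu) ** 3,            # E[(x - mu)^3] = 0 under symmetry
    ])

def gmm_objective(theta, x, W):
    gbar = moments(theta, x).mean(axis=0)
    return gbar @ W @ gbar

# Step 1: identity weighting matrix
theta0 = np.array([x.mean(), x.var()])
step1 = minimize(gmm_objective, theta0, args=(x, np.eye(3)), method="Nelder-Mead")

# Step 2: efficient weighting matrix estimated from first-step moments
g = moments(step1.x, x)
W = np.linalg.inv(g.T @ g / len(x))
step2 = minimize(gmm_objective, step1.x, args=(x, W), method="Nelder-Mead")
print("two-step GMM estimate (mu, sigma2):", step2.x)
```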
Analysis and design of a drain water heat recovery storage unit based on PCM plates
This paper focuses on the detailed analysis of a PCM plate heat storage unit with a particular configuration, seeking the maximum contact area with the fluid (water) and the minimum volume for use in a real household application. To this end, a numerical study of the thermal and fluid-dynamic behaviour of the water flow and of the PCM melting-solidification processes, together with the thermal behaviour of the solid elements of the unit, has been carried out. In addition, an experimental set-up has been designed and built to validate the numerical model and to characterise the performance of the heat storage unit. The purpose of the numerical and experimental study is to present a series of results describing the heat storage unit performance as a function of time. After a preliminary design study, three different cases were simulated and tested. A discrepancy of 7.2% between numerical results and experimental data was found for the heat transfer. The PCM heat storage unit designed is capable of storing approximately 75% of the thermal energy from the waste water heat of the previous process and of recovering part of it to supply around 50% of the thermal energy required to heat up the next process.
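As a rough illustration of how such storage figures are obtained, the energy stored in a PCM unit can be estimated as sensible plus latent heat; all property values, temperatures and the PCM mass below are hypothetical placeholders, not data for the unit studied in the paper.

```python
# Back-of-the-envelope estimate of the heat stored in a PCM unit:
# sensible heat of the solid, latent heat of fusion, sensible heat of the liquid.
# All property values and temperatures below are hypothetical placeholders.
cp_solid = 2.0e3      # J/(kg.K), solid PCM specific heat (assumed)
cp_liquid = 2.2e3     # J/(kg.K), liquid PCM specific heat (assumed)
latent_heat = 200e3   # J/kg, latent heat of fusion (assumed)
mass = 10.0           # kg of PCM (assumed)
T_init, T_melt, T_final = 20.0, 45.0, 55.0   # degC (assumed)

q_stored = mass * (cp_solid * (T_melt - T_init)
                   + latent_heat
                   + cp_liquid * (T_final - T_melt))
print(f"stored energy: {q_stored / 3.6e6:.2f} kWh")  # 1 kWh = 3.6e6 J
```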
Gauge and Yukawa mediated supersymmetry breaking in the triplet seesaw scenario
We propose a novel supersymmetric unified scenario of the triplet seesaw mechanism in which the exchange of the heavy triplets generates both neutrino masses and soft supersymmetry breaking terms. Our framework is very predictive, since it relates neutrino mass parameters, lepton flavour violation in the slepton sector, the sparticle and Higgs spectra, and electroweak symmetry breakdown. The phenomenological viability and the experimental signatures in lepton flavour violating processes are discussed.
Comment: 11 pages, 3 eps figs. Comments and references added. Final version to be published in Phys. Rev. Lett.
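For context, in a supersymmetric triplet (type-II) seesaw the relevant superpotential couplings are schematically Y_T L T L and \lambda H_u \bar{T} H_u, together with the triplet mass term M_T T \bar{T}; integrating out the heavy triplets gives a neutrino mass matrix of the form m_\nu \sim Y_T \lambda v_u^2 / M_T, up to O(1) factors. This is the generic structure of the mechanism, quoted here as background rather than as a formula taken from the paper.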
National input-output table of Brazil
This paper reports basic information about the Brazilian input-output tables. The first part presents the history of constructing I-O tables in Brazil, starting from the non-official table of 1959. The second part illustrates some basic features of the most recent table, with reference year 2005. It also discusses the issues concerning the integration of this table into the 2005 BRICS International Input-Output Table.
Keywords: Brazil; input-output; tables
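The analytical core of input-output accounting is the Leontief system x = (I - A)^{-1} f, relating gross output x to final demand f through the matrix of technical coefficients A. The toy example below uses a made-up 3-sector coefficient matrix, not figures from the Brazilian table.

```python
# Toy Leontief input-output calculation: gross output needed to meet a given
# final demand, x = (I - A)^{-1} f. Coefficients are illustrative, not Brazilian data.
import numpy as np

A = np.array([[0.20, 0.10, 0.05],     # technical coefficients (hypothetical)
              [0.15, 0.25, 0.10],
              [0.05, 0.10, 0.30]])
f = np.array([100.0, 150.0, 80.0])    # final demand by sector (hypothetical units)

leontief_inverse = np.linalg.inv(np.eye(3) - A)
x = leontief_inverse @ f
print("gross output by sector:", np.round(x, 1))
```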
A lower bound in Nehari's theorem on the polydisc
By theorems of Ferguson and Lacey (d=2) and Lacey and Terwilleger (d>2), Nehari's theorem is known to hold on the polydisc D^d for d>1, i.e., if H_\psi is a bounded Hankel form on H^2(D^d) with analytic symbol \psi, then there is a function \phi in L^\infty(\T^d) such that \psi is the Riesz projection of \phi. A method proposed in Helson's last paper is used to show that the constant C_d in the estimate \|\phi\|_\infty \le C_d \|H_\psi\| grows at least exponentially with d; it follows that there is no analogue of Nehari's theorem on the infinite-dimensional polydisc.
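For orientation (a standard definition, not part of the abstract above): a Hankel form with analytic symbol \psi acts on pairs of polynomials f, g in H^2(D^d) as H_\psi(f, g) = \langle f g, \psi \rangle_{H^2(D^d)}, and Nehari's theorem asserts that boundedness of H_\psi forces the existence of a bounded symbol \phi with Riesz projection \psi satisfying \|\phi\|_\infty \le C_d \|H_\psi\|.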
- …