    Internalisation Theory and outward direct investment by emerging market multinationals

    The rise of multinational enterprises from emerging countries (EMNEs) poses an important test for theories of the multinational enterprise such as internalisation theory. It has been contended that new phenomena need new theory. This paper proposes that internalisation theory is appropriate for analysing EMNEs. It examines four approaches to EMNEs—international investment strategies, domestic market imperfections, international corporate networks and domestic institutions—and three case studies—Chinese outward FDI, Indian foreign acquisitions and investment in tax havens—to show the enduring relevance and predictive power of internalisation theory. This analysis encompasses many other approaches as special cases of internalisation theory. The use of internalisation theory to analyse EMNEs is to be commended, not only because of its theoretical inclusivity, but also because of its ability to connect and explain seemingly disparate phenomena.

    QCD Coherence and the Top Quark Asymmetry

    Coherent QCD radiation in the hadroproduction of top quark pairs leads to a forward-backward asymmetry that grows more negative with increasing transverse momentum of the pair. This feature is present in Monte Carlo event generators with coherent parton showering, even though the production process is treated at leading order and has no intrinsic asymmetry before showering. In addition, depending on the treatment of recoils, showering can produce a positive contribution to the inclusive asymmetry. We explain the origin of these features, compare them in fixed-order calculations and the Herwig++, Pythia and Sherpa event generators, and discuss their implications. Comment: 28 pages, 11 figures, 2 tables.
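
    For reference, the asymmetry discussed above is conventionally defined in terms of the top-antitop rapidity difference (a standard definition, not spelled out in the abstract):

        A_{\mathrm{FB}} = \frac{N(\Delta y > 0) - N(\Delta y < 0)}{N(\Delta y > 0) + N(\Delta y < 0)}, \qquad \Delta y = y_t - y_{\bar{t}}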

    Using gamma+jets Production to Calibrate the Standard Model Z(nunu)+jets Background to New Physics Processes at the LHC

    The irreducible background from Z(nunu)+jets to beyond-the-Standard-Model searches at the LHC can be calibrated using gamma+jets data. The method exploits the fact that at high vector boson pT the event kinematics are the same for the two processes and the cross sections differ mainly due to the boson-quark couplings. The method relies on a precise theoretical prediction of the Z/gamma cross section ratio at high pT, which should be insensitive to effects from full event simulation. We study the Z/gamma ratio for final states involving 1, 2 and 3 hadronic jets, using both the leading-order parton shower Monte Carlo program Pythia8 and a leading-order matrix element program Gambos. This enables us to understand the underlying parton dynamics in both processes and to quantify the theoretical systematic uncertainties in the ratio predictions. Using a typical set of experimental cuts, we estimate the net theoretical uncertainty in the ratio to be of order 7% when obtained from a Monte Carlo program using multiparton matrix elements for the hard process. Uncertainties associated with full event simulation are found to be small. The results indicate that an overall accuracy of the method, excluding statistical errors, of order 10% should be possible. Comment: 22 pages, 14 figures; accepted for publication by JHEP.
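
    Schematically, the calibration described above amounts to scaling the measured photon spectrum by the predicted ratio (our notation, not taken from the paper):

        N^{\mathrm{est}}_{Z(\to\nu\bar{\nu})+\mathrm{jets}}(p_T) \simeq R_{\mathrm{th}}(p_T)\, N^{\mathrm{data}}_{\gamma+\mathrm{jets}}(p_T), \qquad R_{\mathrm{th}}(p_T) = \frac{\sigma_{Z(\to\nu\bar{\nu})+\mathrm{jets}}(p_T)}{\sigma_{\gamma+\mathrm{jets}}(p_T)}

    so the quoted ~7% theoretical uncertainty on the ratio enters the background estimate directly, alongside the statistical uncertainty of the gamma+jets sample.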

    Do logarithmic proximity measures outperform plain ones in graph clustering?

    We consider a number of graph kernels and proximity measures, including the commute time kernel, regularized Laplacian kernel, heat kernel, exponential diffusion kernel (also called "communicability"), etc., and the corresponding distances, as applied to clustering nodes in random graphs and several well-known datasets. The model for generating random graphs involves edge probabilities for pairs of nodes that belong to the same class or to different predefined classes of nodes. It turns out that in most cases the logarithmic measures (i.e., measures obtained by taking the logarithm of the proximities) distinguish the underlying classes better than the "plain" measures. A comparison in terms of reject curves of inter-class and intra-class distances confirms this conclusion, and a similar conclusion can be made for several well-known datasets. A possible origin of this effect is that most kernels have a multiplicative nature, while the nature of distances used in clustering algorithms is additive (cf. the triangle inequality); the logarithmic transformation is a tool to convert the former into the latter. Moreover, some distances corresponding to the logarithmic measures possess a meaningful cutpoint additivity property. In our experiments the leader is usually the logarithmic Communicability measure. However, we indicate some more complicated cases in which other measures, typically Communicability and plain Walk, can be the winners. Comment: 11 pages, 5 tables, 9 figures. Accepted for publication in the Proceedings of the 6th International Conference on Network Analysis, May 26-28, 2016, Nizhny Novgorod, Russia.
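
    As a concrete illustration of the plain-versus-logarithmic comparison, the sketch below is our own minimal example, not the authors' code; the graph size, edge probabilities, the average-linkage clustering step, and the small offset inside the logarithm are assumptions. It builds a two-class random graph, computes the communicability kernel exp(A), and clusters nodes using the distances induced by the kernel and by its elementwise logarithm.

        import numpy as np
        from scipy.linalg import expm
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        rng = np.random.default_rng(0)
        n, p_in, p_out = 60, 0.3, 0.05                 # assumed class size and edge probabilities
        labels = np.repeat([0, 1], n // 2)

        # Planted-partition adjacency matrix: denser within classes than between them.
        P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
        A = (rng.random((n, n)) < P).astype(float)
        A = np.triu(A, 1)
        A = A + A.T

        def kernel_to_distance(K):
            # Distance induced by a kernel/proximity matrix: d(i,j)^2 = K_ii + K_jj - 2*K_ij.
            d2 = np.diag(K)[:, None] + np.diag(K)[None, :] - 2.0 * K
            return np.sqrt(np.clip(d2, 0.0, None))

        K_plain = expm(A)                              # communicability kernel
        K_log = np.log(K_plain + 1e-12)                # elementwise logarithm; offset guards against exact zeros

        for name, K in [("plain", K_plain), ("log", K_log)]:
            D = kernel_to_distance(K)
            Z = linkage(squareform(D, checks=False), method="average")
            pred = fcluster(Z, t=2, criterion="maxclust") - 1
            accuracy = max(np.mean(pred == labels), np.mean(pred != labels))
            print(f"{name:5s} communicability: clustering accuracy = {accuracy:.2f}")

    The same kernel-to-distance construction applies to the other kernels mentioned above (heat, regularized Laplacian, commute time); only the line computing K_plain changes.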

    Novel statistical approaches for non-normal censored immunological data: analysis of cytokine and gene expression data

    Background: For several immune-mediated diseases, immunological analysis will become more complex in the future, with datasets in which cytokine and gene expression data play a major role. These data have certain characteristics that require sophisticated statistical analysis, such as strategies for non-normal distributions and censoring. Additionally, complex and multiple immunological relationships need to be adjusted for potential confounding and interaction effects. Objective: We aimed to introduce and apply different methods for the statistical analysis of non-normal censored cytokine and gene expression data. Furthermore, we assessed the performance and accuracy of a novel regression approach that allows adjustment for covariates and potential confounding. Methods: For non-normally distributed censored data, traditional methods such as the Kaplan-Meier method or the generalized Wilcoxon test are described. In order to adjust for covariates, the novel approach named Tobit regression on ranks is introduced. Its performance and accuracy for the analysis of non-normal censored cytokine/gene expression data were evaluated in a simulation study and in a statistical experiment applying permutation and bootstrapping. Results: If adjustment for covariates is not necessary, traditional statistical methods are adequate for non-normal censored data. When additional adjustment is required, Tobit regression on ranks is a valid method, comparable with these traditional approaches. Its power, type-I error rate and accuracy were comparable to those of classical Tobit regression. Conclusion: Non-normally distributed censored immunological data require appropriate statistical methods. Tobit regression on ranks meets these requirements and can be used to adjust for covariates and potential confounding in large and complex immunological datasets.
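
    One plausible reading of "Tobit regression on ranks" is to rank-transform the left-censored outcome and then fit a standard Tobit (censored-normal) regression to the ranks by maximum likelihood. The sketch below follows that reading; the simulated data, variable names and detection limit are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy import stats, optimize

        rng = np.random.default_rng(1)
        n = 200
        group = rng.integers(0, 2, n)                     # e.g. case/control indicator
        age = rng.normal(50, 10, n)                       # covariate to adjust for
        latent = 2.0 * group + 0.02 * age + 1.5 * rng.standard_normal(n)
        limit = np.quantile(latent, 0.25)                 # assumed assay detection limit
        y = np.maximum(latent, limit)                     # observed, left-censored "cytokine" values
        censored = latent <= limit

        # Rank-transform the outcome; censored observations share the lowest (tied) rank.
        y_rank = stats.rankdata(y)
        limit_rank = y_rank[censored].max()

        X = np.column_stack([np.ones(n), group, age])

        def neg_loglik(params):
            beta, log_sigma = params[:-1], params[-1]
            sigma = np.exp(log_sigma)
            mu = X @ beta
            ll = np.where(
                censored,
                stats.norm.logcdf((limit_rank - mu) / sigma),           # probability mass below the limit
                stats.norm.logpdf((y_rank - mu) / sigma) - log_sigma,   # density of the observed ranks
            )
            return -ll.sum()

        start = np.r_[y_rank.mean(), np.zeros(X.shape[1] - 1), np.log(y_rank.std())]
        fit = optimize.minimize(neg_loglik, start, method="BFGS")
        print("Tobit-on-ranks coefficients (intercept, group, age):", fit.x[:-1])

    The coefficient on group then serves as the covariate-adjusted effect estimate; the paper evaluates this type of approach against classical Tobit regression using simulation, permutation and bootstrapping.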

    A retrospective and agenda for future research on Chinese outward foreign direct investment

    Our original paper, “The determinants of Chinese Outward Foreign Direct Investment”, was the first theoretically based empirical analysis of the phenomenon. It utilised internalisation theory to show that Chinese state-owned firms reacted to home country market imperfections to surmount barriers to foreign entry arising from naivety and the lack of obvious ownership advantages, leveraging institutional factors including favourable policy stimuli. This special theory explained outward foreign direct investment (OFDI) but also provided surprises, including the apparent appetite for risk evinced by these early investors, which led us to conjecture that domestic market imperfections, particularly in the domestic capital market, might be responsible. The article stimulated a massive and largely successful subsequent research effort on emerging country multinationals. In this retrospective article we review some of the main strands of research that ensued, for the insight they offer into the theme of our commentary: that theoretical development can only come through embracing yet more challenging, different, and new contexts. We also make suggestions for future research directions.

    Chiral U(1) flavor models and flavored Higgs doublets: the top FB asymmetry and the Wjj

    We present U(1) flavor models for a leptophobic Z' with flavor-dependent couplings to the right-handed up-type quarks in the Standard Model, which can accommodate the recent data on the top forward-backward (FB) asymmetry and the dijet resonance associated with a W boson reported by the CDF Collaboration. Such flavor-dependent leptophobic charge assignments generally require extra chiral fermions for anomaly cancellation. Also, the chiral nature of the U(1)' flavor symmetry calls for new U(1)'-charged Higgs doublets in order for the SM fermions to have realistic renormalizable Yukawa couplings. The stringent constraints from the top FB asymmetry at the Tevatron and same-sign top pair production at the LHC can be evaded due to the contributions of the extra Higgs doublets. We also show that the extension could realize cold dark matter candidates. Comment: 40 pages, 10 figures; added 1 figure and extended discussion; accepted for publication in JHEP.
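
    The kind of interaction the abstract refers to can be written generically as a flavor-dependent coupling of the Z' to right-handed up-type quarks (our schematic notation; the charge matrix and its entries are not taken from the paper):

        \mathcal{L} \supset g_X\, Z'_\mu\, \bar{u}_{R\,i}\, \gamma^\mu\, (Q_X)_{ij}\, u_{R\,j}

    where flavor-off-diagonal entries of Q_X (e.g. a u-t coupling) are what allow t-channel Z' exchange to modify the top FB asymmetry, and the anomaly-cancellation and Yukawa-structure issues mentioned above arise because Q_X acts only on right-handed fields.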

    When Is Visual Information Used to Control Locomotion When Descending a Kerb?

    Background: Descending kerbs during locomotion involves the regulation of appropriate foot placement before the kerb-edge and foot clearance over it. It also involves the modulation of gait output to ensure the body-mass is safely and smoothly lowered to the new level. Previous research has shown that vision is used in such adaptive gait tasks for feedforward planning, with vision from the lower visual field (lvf) used for online updating. The present study determined when lvf information is used to control/update locomotion when stepping from a kerb. Methodology/Principal Findings: 12 young adults stepped down a kerb during ongoing gait. Force sensitive resistors (attached to participants' feet), interfaced with a high-speed PDLC 'smart glass' sheet, allowed the lvf to be unpredictably occluded from heel-contact of either the penultimate or the final step before the kerb-edge until contact with the lower level. Analysis focussed on determining changes in foot placement distance before the kerb-edge, clearance over it, and in kinematic measures of the step down. Lvf occlusion from the instant of final step contact had no significant effect on any dependent variable (p>0.09). Occlusion of the lvf from the instant of penultimate step contact had a significant effect on foot clearance and on several kinematic measures, with findings consistent with participants becoming uncertain regarding the relative horizontal location of the kerb-edge. Conclusion/Significance: These findings suggest that concurrent feedback of the lower limb, kerb-edge, and/or floor area immediately in front of/below the kerb is not used when stepping from a kerb during ongoing gait. Instead, heel-clearance and pre-landing kinematic parameters are determined/planned using lvf information acquired in the penultimate step during the approach to the kerb-edge, with information related to foot placement before the kerb-edge being the most salient.
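
    As a purely illustrative sketch of the occlusion protocol (hypothetical function and variable names; no real device API is implied), the logic amounts to an event loop that switches the PDLC sheet opaque at the randomly selected heel-contact and clears it again at contact with the lower level:

        import random

        def run_trial(samples, occlude, clear, force_threshold=5.0):
            # samples: iterable of (step_label, fsr_force, on_lower_level) readings;
            # occlude/clear: callables that switch the PDLC sheet opaque/transparent.
            trigger_step = random.choice(["penultimate", "final", "none"])   # unpredictable occlusion condition
            occluding = False
            for step_label, fsr_force, on_lower_level in samples:
                heel_contact = fsr_force > force_threshold                   # force-sensitive resistor crosses threshold
                if heel_contact and step_label == trigger_step and not occluding:
                    occlude()                                                # lower visual field blocked from this instant
                    occluding = True
                if on_lower_level and occluding:
                    clear()                                                  # vision restored at contact with the lower level
                    occluding = False
            return trigger_step

    With dummy occlude/clear callables the function can be exercised without any hardware.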