
    An Empirical Analysis of Takeover Predictions in the UK: Application of Artificial Neural Networks and Logistic Regression

    This study undertakes an empirical analysis of takeover predictions in the UK. The objectives of this research are twofold: first, to test whether it is possible to predict or identify takeover targets before they receive any takeover bid; second, to test whether prediction outcomes can be improved by extending the firm-specific characteristics to include corporate governance variables, and by employing a different technique that is becoming an established analytical tool through its extensive application in the corporate finance field. To address the first objective, Logistic Regression (LR) and Artificial Neural Networks (ANNs) are applied as modelling techniques for predicting target companies in the UK; by applying ANNs to takeover prediction, their classification ability is tested and the results are compared with those of LR. For the second objective, non-financial characteristics of companies, namely corporate governance characteristics, are employed in addition to the financial variables. For the first time, ANNs are applied to corporate governance variables for takeover prediction purposes. In the final section, the two groups of variables are combined to test whether the earlier outcomes based on financial and non-financial variables could be improved. However, the results suggest that predicting takeovers from publicly available information that is already reflected in companies' share prices is not feasible, at least with the current techniques of LR and ANNs. These results are consistent with the semi-strong form of the efficient market hypothesis.
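    The LR-versus-ANN comparison the abstract describes can be illustrated with a minimal, hypothetical sketch: synthetic firm features stand in for the financial and governance variables (the study's actual data are not reproduced here), and scikit-learn's `LogisticRegression` and `MLPClassifier` play the roles of LR and ANN.

```python
# Illustrative sketch only: synthetic features stand in for the financial
# and corporate-governance characteristics described in the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 6))            # e.g. leverage, liquidity, board size...
y = rng.integers(0, 2, size=n)         # 1 = firm later became a takeover target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

lr = LogisticRegression().fit(X_tr, y_tr)
ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)

print("LR accuracy :", accuracy_score(y_te, lr.predict(X_te)))
print("ANN accuracy:", accuracy_score(y_te, ann.predict(X_te)))
```

    With labels unrelated to the features, both models hover near chance, which loosely mirrors the abstract's efficient-market conclusion; on real data the two accuracies would be the quantities being compared.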

    A Bootstrapping architecture for time expression recognition in unlabelled corpora via syntactic-semantic patterns

    In this paper we describe a semi-supervised approach, based on bootstrapping, to the extraction of time expression mentions in large unlabelled corpora. Bootstrapping techniques rely on a relatively small number of initial human-supplied examples (termed “seeds”) of the type of entity or concept to be learned, in order to capture an initial set of patterns or rules from the unlabelled text that extract the supplied data. In turn, the learned patterns are employed to find new potential examples, and the process is repeated to grow the set of patterns and (optionally) the set of examples. To prevent the learned pattern set from producing spurious results, it becomes essential to implement a ranking and selection procedure that filters out “bad” patterns and, depending on the case, new candidate examples. Therefore, the type of patterns employed (the knowledge representation) as well as the ranking and selection procedure are paramount to the quality of the results. We present a complete bootstrapping algorithm for the recognition of time expressions, with special emphasis on the type of patterns used (a combination of semantic and morpho-syntactic elements) and the ranking and selection criteria. Bootstrapping techniques have previously been employed with limited success for several NLP problems, both of recognition and of classification, but their application to time expression recognition is, to the best of our knowledge, novel. As of this writing, the described architecture is in the final stages of implementation, with experimentation and evaluation already underway.
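    The seed-to-pattern loop described above can be sketched in a few lines. The pattern representation here (one word of left and right context) and the ranking heuristic (keep patterns that match at least one known example) are deliberate simplifications of the paper's syntactic-semantic patterns, and the corpus and seed are invented.

```python
# Toy sketch of the bootstrapping loop: seeds -> patterns -> new examples.
corpus = [
    "the meeting is on monday at noon",
    "see you on friday at noon",
    "we left on friday at dawn",
]
seeds = {"monday"}

def patterns_for(example, sentences):
    """Collect (left-word, right-word) context patterns around an example."""
    pats = set()
    for s in sentences:
        words = s.split()
        for i, w in enumerate(words):
            if w == example and 0 < i < len(words) - 1:
                pats.add((words[i - 1], words[i + 1]))
    return pats

def apply_pattern(pat, sentences):
    """Return every word that occurs between the pattern's context words."""
    left, right = pat
    found = set()
    for s in sentences:
        words = s.split()
        for i in range(1, len(words) - 1):
            if words[i - 1] == left and words[i + 1] == right:
                found.add(words[i])
    return found

examples = set(seeds)
for _ in range(2):  # two bootstrapping iterations
    pats = set().union(*(patterns_for(e, corpus) for e in examples))
    # rank/select: keep only patterns that rediscover a known example
    good = [p for p in pats if apply_pattern(p, corpus) & examples]
    for p in good:
        examples |= apply_pattern(p, corpus)

print(sorted(examples))  # → ['friday', 'monday']
```

    The pattern ("on", "at") learned from the seed "monday" generalizes to "friday"; the selection step is what would, in a real system, filter out patterns that over-generate.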

    How and why communications industry suppliers get “squeezed out” by outsourcing: cases, impact and the next phases

    The communications systems, terminals, and services industries have undergone over the past ten years a significant internal technological evolution and an external revolution at the customer end (such as the shift to IP, wireless 3G and LTE evolutions, new terminals, and broadband). Very little management research has studied their survivability, irrespective of changes in demand volumes, in the face of technology sourcing and outsourcing practices driven by other global industries acting as predators in view of the huge business potential of communications products and services. These other industries include the computing software, semiconductor, and contract manufacturing industries, many with roots in emerging countries. This paper analyzes the implications of using in-sourced, genuinely non-proprietary open communications standards, of the wider use of in-sourced or purchased technologies, and of outsourced contract manufacturing. The methodology used is equilibrium analysis applied to case analysis data. The analyses show a trend towards active or passive knowledge leakage; three specific areas are mentioned as examples. The paper also shows the processes by which those industries eventually bounce back in a later cycle.
    Keywords: Communications industry; Communications industry suppliers; Business processes; Intellectual property; Technical competence; Customer bases

    D7.4 Third evaluation report. Evaluation of PANACEA v3 and produced resources

    D7.4 reports on the evaluation of the different components integrated in the PANACEA third cycle of development, as well as the final validation of the platform itself. All validation and evaluation experiments follow the evaluation criteria already described in D7.1. The main goal of the WP7 tasks was to test the (technical) functionalities and capabilities of the middleware that allows the integration of the various resource-creation components into an interoperable distributed environment (WP3), and to evaluate the quality of the components developed in WP5 and WP6. The content of this deliverable is thus complementary to D8.2 and D8.3, which tackle advantages and usability in industrial scenarios. It has to be noted that the PANACEA third cycle of development addressed many components that are still under research. The main goal for this evaluation cycle is thus to assess the methods experimented with and their potential for becoming actual production tools exploitable outside research labs. For most of the technologies, an attempt was made to re-interpret standard evaluation measures, usually expressed in terms of accuracy, precision and recall, as measures related to a reduction of costs (time and human resources) in current practices based on the manual production of resources. In order to do so, the different tools had to be tuned and adapted to maximize precision, and for some tools the possibility of offering confidence measures that could separate out the resources still needing manual revision was attempted. Furthermore, the extension to languages other than English, also a PANACEA objective, has been evaluated. The main facts about the evaluation results are then summarized.
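    The precision-maximization idea in the deliverable, keeping only predictions whose confidence clears a threshold so that what remains needs no manual revision, can be sketched as follows. The predictions and confidence scores below are invented for illustration.

```python
# Hypothetical tool output: (prediction, confidence, is_correct).
preds = [("dog", 0.95, True), ("cat", 0.80, True),
         ("car", 0.60, False), ("sun", 0.40, True)]

def precision_at(threshold, predictions):
    """Precision over the predictions whose confidence >= threshold."""
    kept = [correct for _, conf, correct in predictions if conf >= threshold]
    return sum(kept) / len(kept) if kept else None

for t in (0.0, 0.5, 0.9):
    print(t, precision_at(t, preds))
# → 0.0 0.75
#   0.5 0.6666666666666666
#   0.9 1.0
```

    Raising the threshold trades recall (fewer resources produced automatically) for precision (less manual revision), which is exactly the cost framing the deliverable describes.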

    The Effect of M&A on Employee Performance: An empirical study on post-M&A employee performance in private Norwegian target companies

    This paper analyses post-M&A employee performance for private Norwegian target companies using accounting data from 2007 to 2016. We have created an algorithm that identifies ownership changes in firms from accounting data. Our model is based on the Cobb-Douglas production function to measure firm productivity, and uses Propensity Score Matching (PSM) to control for confounding variables. Additionally, we investigate whether the effects of M&A differ by labor-force size or sector. As a robustness test, we use Nearest Neighbour matching combined with a Difference-in-Differences (DiD) analysis to control for possible bias in the PSM analyses. Our results indicate that M&As have no effect on employee performance in private Norwegian companies. Furthermore, the results indicate a negative effect on firm performance post-M&A. We find no reliable differences in employee performance by labor-force size or sector. However, we find that firm performance for the companies with the largest labor force, those in the retail industry, and those in the remaining sectors is negatively affected post-M&A. The DiD analysis mostly supports the PSM findings on employee performance and strengthens the validity of our findings. However, we cannot exclude potential confounding of the firm-performance outcome variable.
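    The PSM step the abstract relies on can be sketched on synthetic data: estimate each firm's propensity to be treated (acquired) from its covariates, match treated firms to the nearest control on that score, and compare outcomes. The covariates, treatment rule, and outcome below are invented and do not reproduce the thesis's accounting data.

```python
# Minimal propensity-score-matching sketch on synthetic firm data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))                                # firm covariates
treated = (X[:, 0] + rng.normal(size=n) > 0).astype(int)   # acquired or not
outcome = X @ np.array([0.5, 0.2, 0.1]) + rng.normal(size=n)

# 1. estimate propensity scores P(treated | X)
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. match each treated firm to the nearest control on the score
t_idx = np.where(treated == 1)[0]
c_idx = np.where(treated == 0)[0]
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(axis=1)]

# 3. average treatment effect on the treated (ATT)
att = (outcome[t_idx] - outcome[matches]).mean()
print(f"estimated ATT: {att:.3f}")
```

    Because treatment here has no true effect on the outcome, a well-matched sample should yield an ATT near zero, whereas a naive treated-minus-control mean would be biased by the confounded first covariate; that bias removal is the role PSM plays in the paper.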

    The COMPASS Experiment at CERN

    The COMPASS experiment makes use of the CERN SPS high-intensity muon and hadron beams for the investigation of the nucleon spin structure and the spectroscopy of hadrons. One or more outgoing particles are detected in coincidence with the incoming muon or hadron. A large polarized target inside a superconducting solenoid is used for the measurements with the muon beam. Outgoing particles are detected by a two-stage spectrometer with large angular acceptance and a large momentum range. The setup is built from several types of tracking detectors, chosen according to the expected incident rate, the required spatial resolution and the solid angle to be covered. Particle identification is achieved using a RICH counter and both hadronic and electromagnetic calorimeters. The setup has been operated successfully from 2002 onwards using a muon beam. Data with a hadron beam were also collected in 2004. This article describes the main features and performance of the spectrometer in 2004; a short summary of the 2006 upgrade is also given. Comment: 84 pages, 74 figures.

    Identification and Estimation of Intra-Firm and Industry Competition via Ownership Change

    This paper proposes and empirically implements a framework for analyzing industry competition and the degree of joint profit maximization of merging firms in differentiated-product industries. Using pre- and post-merger industry data, I am able to separate merging firms' intra-organizational pricing considerations from industry pricing considerations. The insights of the paper shed light on a long-standing debate in the theoretical literature about the consequences of organizational integration. Moreover, I propose a novel approach to directly estimating industry conduct that relies on ownership changes and input price variation. I apply my framework using data from the ready-to-eat cereal industry, covering the 1993 Post-Nabisco merger. My results show an increasing degree of joint profit maximization of the merged entities over the first two years after the merger, eventually leading to almost full maximization of joint profits. I find that between 14.3 and 25.6 percent of industry markups can be attributed to cooperative industry behavior, while the remaining markup is due to product differentiation of multi-product firms.
