Does geography play a role in takeovers? Theory and Finnish micro-level evidence
This study explores domestic inter-regional merger flows. Theoretical considerations based on monitoring are developed. The empirical part of the study draws on comprehensive public data on domestic mergers and acquisitions, matched to micro-level data sources maintained by Statistics Finland to obtain several variables that characterize the companies involved. The Finnish evidence reveals that geographical closeness matters a great deal for inter-regional merger flows: a great number of domestic mergers occur within narrowly defined regions. Domestic merger flows substantially reinforce the core-periphery dimension. The most important finding from the matched data is that a strong ability of the acquiring company to monitor the target (measured by the knowledge embodied in human capital) can support mergers across distant locations, other things being equal. Geographical closeness and proximity across industries are not related, based on the Finnish evidence.
Does Geography Play a Role in Domestic Takeovers? Theory and Finnish Micro-level Evidence
This paper explores domestic mergers and acquisitions (M&As) from a regional perspective. The Finnish firm-level evidence reveals that geographical closeness matters a great deal for M&As within a single country: a great number of domestic M&As occur within narrowly defined regions. Interestingly, domestic M&As reinforce the core-periphery dimension. The results from matched firm-level data show that a strong ability of the acquiring company to monitor the target (measured by the knowledge embodied in human capital) can support M&As across distant locations.
Keywords: mergers, acquisitions, monitoring, agglomeration
Strange kinetics: conflict between density and trajectory description
We study a process of anomalous diffusion, based on intermittent velocity fluctuations, and we show that its scaling depends on whether we observe the motion of many independent trajectories or that of a density driven by a Liouville-like equation. The reason for this discrepancy seems to be that the Liouville-like equation is unable to reproduce the multi-scaling properties emerging from trajectory dynamics. We argue that this conflict between density and trajectory might help us to define the uncertain border between dynamics and thermodynamics, and that between quantum and classical physics as well.
Comment: submitted to Chemical Physics
Error Estimation of Bathymetric Grid Models Derived from Historic and Contemporary Data Sets
The past century has seen remarkable advances in technologies associated with positioning and the measurement of depth. Lead lines have given way to single beam echo sounders, which in turn are being replaced by multibeam sonars and other means of remotely and rapidly collecting dense bathymetric datasets. Sextants were replaced by radio navigation, then transit satellite, GPS and now differential GPS. With each new advance comes tremendous improvement in the accuracy and resolution of the data we collect. Given these changes and given the vastness of the ocean areas we must map, the charts we produce are mainly compilations of multiple data sets collected over many years and representing a range of technologies. Yet despite our knowledge that the accuracy of the various technologies differs, our compilations have traditionally treated each sounding with equal weight. We address these issues in the context of generating regularly spaced grids containing bathymetric values. Gridded products are required for a number of earth sciences studies and for generating the grid we are often forced to use a complex interpolation scheme due to the sparseness and irregularity of the input data points. Consequently, we are faced with the difficult task of assessing the confidence that we can assign to the final grid product, a task that is not usually addressed in most bathymetric compilations. Traditionally the hydrographic community has considered each sounding equally accurate and there has been no error evaluation of the bathymetric end product. This has important implications for use of the gridded bathymetry, especially when it is used for generating further scientific interpretations. In this paper we approach the problem of assessing the confidence of the final bathymetry gridded product via a direct-simulation Monte Carlo method. We start with a small subset of data from the International Bathymetric Chart of the Arctic Ocean (IBCAO) grid model [Jakobsson et al., 2000]. 
This grid is compiled from a mixture of data sources ranging from single beam soundings with available metadata, to spot soundings with no available metadata, to digitized contours; the test dataset shows examples of all of these types. From this database, we assign a priori error variances based on available metadata, and when this is not available, based on a worst-case scenario in an essentially heuristic manner. We then generate a number of synthetic datasets by randomly perturbing the base data using normally distributed random variates, scaled according to the predicted error model. These datasets are next re-gridded using the same methodology as the original product, generating a set of plausible grid models of the regional bathymetry that we can use for standard deviation estimates. Finally, we repeat the entire random estimation process and analyze each run's standard deviation grids in order to examine sampling bias and standard error in the predictions. The final products of the estimation are a collection of standard deviation grids, which we combine with the source data density in order to create a grid that contains information about the bathymetric model's reliability.
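The perturbation-and-regrid loop described above can be sketched in a few lines. This is a minimal, self-contained illustration, not the IBCAO pipeline: the soundings, the two-tier error model, and the inverse-distance gridder are all hypothetical stand-ins for the real data and the production interpolation scheme.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sparse soundings (x, y, depth), each tagged with an a priori
# standard deviation reflecting its survey technology (illustrative values:
# 0.5 m for "modern" data, 5 m for "legacy" data with no metadata).
n_pts = 200
x = rng.uniform(0, 10, n_pts)
y = rng.uniform(0, 10, n_pts)
depth = 100 + 20 * np.sin(x) + 10 * np.cos(y)
sigma = np.where(rng.random(n_pts) < 0.5, 0.5, 5.0)

# Simple inverse-distance-weighted gridder standing in for the production
# interpolation scheme.
gx, gy = np.meshgrid(np.linspace(0, 10, 25), np.linspace(0, 10, 25))

def grid_idw(z, power=2.0, eps=1e-6):
    d2 = (gx[..., None] - x) ** 2 + (gy[..., None] - y) ** 2
    w = 1.0 / (d2 + eps) ** (power / 2)
    return (w * z).sum(axis=-1) / w.sum(axis=-1)

# Direct-simulation Monte Carlo: perturb each sounding by its predicted
# error, re-grid with the same method, and accumulate per-cell statistics.
n_runs = 50
stack = np.empty((n_runs,) + gx.shape)
for k in range(n_runs):
    z_pert = depth + rng.normal(0.0, sigma)
    stack[k] = grid_idw(z_pert)

std_grid = stack.std(axis=0)  # per-cell standard deviation of the grid
print(std_grid.shape)
```

In a real application the standard-deviation grid would then be combined with a source-data-density grid, as the abstract describes, so that cells far from any sounding are flagged as less reliable.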
Nonextensive statistical mechanics and central limit theorems I - Convolution of independent random variables and q-product
In this article we review the standard versions of the Central and of the Lévy-Gnedenko Limit Theorems, and illustrate their application to the convolution of independent random variables associated with the distribution known as q-Gaussian. This distribution emerges upon extremisation of the nonadditive entropy, basis of nonextensive statistical mechanics. It has a finite variance for q < 5/3, and an infinite one for q ≥ 5/3. We exhibit that, in the case of (standard) independence, the q-Gaussian has either the Gaussian (if q < 5/3) or the α-stable Lévy distributions (if q > 5/3) as its attractor in probability space. Moreover, we review a generalisation of the product, the q-product, which plays a central role in the approach of the specially correlated variables emerging within the nonextensive theory.
Comment: 13 pages, 4 figures. To appear in the Proceedings of the conference CTNEXT07, Complexity, Metastability and Nonextensivity, Catania, Italy, 1-5 July 2007, Eds. S. Abe, H.J. Herrmann, P. Quarati, A. Rapisarda and C. Tsallis (American Institute of Physics, 2008), in press.
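The q-product mentioned in the abstract has the standard definition x ⊗_q y = [x^(1-q) + y^(1-q) - 1]^(1/(1-q)) for positive arguments, recovering the ordinary product as q → 1. A minimal numeric sketch of that definition and its q → 1 limit:

```python
def q_product(x, y, q):
    """q-product of nonextensive statistics, for x, y > 0.

    Defined as [x^(1-q) + y^(1-q) - 1]^(1/(1-q)); it reduces to the
    ordinary product x*y in the limit q -> 1.
    """
    if abs(q - 1.0) < 1e-12:
        return x * y
    base = x ** (1.0 - q) + y ** (1.0 - q) - 1.0
    if base <= 0.0:
        return 0.0  # common cutoff convention when the bracket is non-positive
    return base ** (1.0 / (1.0 - q))

print(q_product(2.0, 3.0, 1.0))    # ordinary product
print(q_product(2.0, 3.0, 0.999))  # close to the ordinary product
print(q_product(2.0, 3.0, 0.5))    # genuinely deformed product
```

For q = 0.999 the result is within about 0.2% of 6, while for q = 0.5 it is markedly different, illustrating the deformation.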
Estimating the Probability of a Rare Event Over a Finite Time Horizon
We study an approximation for the zero-variance change of measure to estimate the probability of a rare event in a continuous-time Markov chain. The rare event occurs when the chain reaches a given set of states before some fixed time limit. The jump rates of the chain are expressed as functions of a rarity parameter in such a way that the probability of the rare event goes to zero when the rarity parameter goes to zero, and the behavior of our estimators is studied in this asymptotic regime. After giving a general expression for the zero-variance change of measure in this situation, we develop an approximation of it via a power series and show that this approximation provides a bounded relative error when the rarity parameter goes to zero. We illustrate the performance of our approximation on small numerical examples of highly reliable Markovian systems. We compare it to a previously proposed heuristic that combines forcing with balanced failure biasing. We also exhibit the exact zero-variance change of measure for these examples and compare it with these two approximations.
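The basic mechanics behind such estimators, simulating under biased jump rates and correcting with the path likelihood ratio, can be illustrated on a toy model. This sketch is not the paper's approximation: it uses a hypothetical two-component repairable system with made-up rates and a simple constant failure-rate boost, purely to show how the change of measure and the likelihood-ratio weight fit together.

```python
import math
import random

random.seed(1)

# Hypothetical 2-component system: state = number of failed components.
# Rare event: both components fail (state 2) before the horizon T.
LAM, MU, T = 0.01, 1.0, 1.0  # failure rate, repair rate, time limit

def rates(state, boost=1.0):
    """Outgoing transition rates from `state`; failure rates scaled by `boost`."""
    r = {}
    if state < 2:
        r[state + 1] = (2 - state) * LAM * boost
    if state > 0:
        r[state - 1] = state * MU
    return r

def simulate(boost=1.0):
    """One path under the (possibly biased) dynamics.

    Returns (rare-event indicator, likelihood ratio of the path)."""
    t, state, lr = 0.0, 0, 1.0
    while True:
        r_bias, r_true = rates(state, boost), rates(state, 1.0)
        tot_b, tot_t = sum(r_bias.values()), sum(r_true.values())
        tau = random.expovariate(tot_b)
        if t + tau > T:
            # Censored at the horizon: correct for the no-jump probability.
            lr *= math.exp(-(tot_t - tot_b) * (T - t))
            return 0.0, lr
        t += tau
        # Choose the next state under the biased dynamics.
        u, acc, nxt = random.random() * tot_b, 0.0, None
        for s, rate in r_bias.items():
            acc += rate
            if u <= acc:
                nxt = s
                break
        if nxt is None:          # floating-point guard
            nxt = s
        # Likelihood ratio update: true / biased density of this jump.
        lr *= (r_true[nxt] / r_bias[nxt]) * math.exp(-(tot_t - tot_b) * tau)
        state = nxt
        if state == 2:
            return 1.0, lr

n = 20000
est = sum(ind * lr for ind, lr in (simulate(boost=30.0) for _ in range(n))) / n
print(est)  # importance-sampling estimate of the rare-event probability
```

Crude Monte Carlo with the same budget would see only a handful of hits (the true probability here is on the order of 2e-4); under the biased measure the event is common, and the bounded likelihood-ratio weights keep the estimator's relative error small.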
Reliability calculation using randomization for Markovian fault-tolerant computing systems
The randomization technique for computing transient probabilities of Markov processes is presented. The technique is applied to a Markov process model of a simplified fault-tolerant computer system for illustrative purposes. It is applicable to much larger and more complex models. Transient state probabilities are computed, from which reliabilities are derived. An accelerated version of the randomization algorithm is developed which exploits the 'stiffness' of the models to gain increased efficiency. A great advantage of the randomization approach is that it easily allows probabilities and reliabilities to be computed to any predetermined accuracy.
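Randomization (also called uniformization) expresses the transient distribution of a CTMC as a Poisson-weighted sum over powers of a discrete-time transition matrix, and the "any predetermined accuracy" property comes from truncating that sum once the accumulated Poisson weight is close enough to 1. A minimal sketch, using a made-up 2-state repairable component rather than the paper's fault-tolerant system model:

```python
import math
import numpy as np

def transient_probs(Q, p0, t, tol=1e-10):
    """Transient distribution of a CTMC via randomization (uniformization).

    Q  : generator matrix (rows sum to zero)
    p0 : initial distribution
    Truncates the Poisson sum once the accumulated weight exceeds 1 - tol,
    so the truncation error is below `tol` by construction. (For large
    Lam*t the leading weight exp(-Lam*t) underflows; production codes
    handle that case specially.)
    """
    Lam = max(-Q.diagonal())             # uniformization rate
    P = np.eye(Q.shape[0]) + Q / Lam     # uniformized DTMC transition matrix
    v = p0.copy()                        # p0 @ P^k, updated incrementally
    out = np.zeros_like(p0)
    weight = math.exp(-Lam * t)          # Poisson(Lam*t) pmf at k = 0
    acc, k = 0.0, 0
    while acc < 1.0 - tol:
        out += weight * v
        acc += weight
        k += 1
        weight *= Lam * t / k            # next Poisson pmf term
        v = v @ P
    return out

# Hypothetical 2-state component: fails at rate 0.5, repaired at rate 2.0.
Q = np.array([[-0.5, 0.5],
              [ 2.0, -2.0]])
p0 = np.array([1.0, 0.0])
p = transient_probs(Q, p0, t=1.0)
print(p)  # p[0] is the point availability at t = 1
```

For this 2-state chain the answer is known in closed form, p[0] = 0.8 + 0.2·exp(-2.5), which makes the truncation-controlled accuracy easy to verify.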