
    Hostile Share Acquisitions and Corporate Governance: A Framework for Evaluating Antitakeover Activities

    In recent years, there has been a significant increase in the number of hostile share acquisitions of American businesses. The authors examine the validity of the various defensive measures employed by target companies to defeat or deter a hostile takeover bid. They argue that antitakeover activity should not be viewed as a separate subset of legal analysis; rather, it should be analyzed according to four traditional principles of corporate governance: (1) the discretion afforded corporate management by the business judgment rule; (2) the prohibition against discriminating between members of the same class of shareholders; (3) the prohibition against shifting control from the shareholders to the board of directors for actions reserved by statute to the shareholders; and (4) the prohibition against shifting control from a majority to a minority of shareholders for decisions reserved by statute to the majority. Moreover, the authors assert that even if a court uses these principles of corporate governance as the basis for its decision, the court's analysis is still incomplete if it focuses only on the target board's initial decision to resist a hostile share acquisition. Rather, a court must undertake a two-step analysis, whereby it looks first at the target board's initial decision to resist the hostile takeover, and second at the means employed by the target board to effectuate that decision.

    Management Buyouts: Creating or Appropriating Shareholder Wealth?

    The name of the game in corporate America today is leverage. Whether through leveraged buyouts or leveraged recapitalizations, many of the United States' largest corporations are rapidly trading equity capital for debt. This trend began only a few years ago when a small group of financial entrepreneurs, which included Carl Icahn, T. Boone Pickens, Asher Edelman, Irwin Jacobs, and Ronald Perelman, found that they could finance large stock purchases of major corporations through the use of high-yield ("junk") bonds, leading to either an acquisition of the target or its forced restructuring. The general goal of these financiers was to force a reconciliation between what they perceived as low stock prices and corporate assets of far greater potential value. Their efforts have been tremendously profitable. The corporate targets of these hostile share acquisitions, however, did not sit idly by and wait to have their shares gobbled up. The defenses they erected are now famous because of their frequent use and colorful names: the Pac-Man defense, the scorched earth defense, shark repellents, and poison pills. While these defenses proved to be an initial deterrent to hostile acquisitions, more creative financing techniques and other offensive weapons have rendered these defenses something of a Maginot Line. Target managements, searching for a way to protect their shareholders, their jobs, or both, increasingly have taken the approach of fighting fire with fire, that is, using leverage and redeployment of assets in an attempt to create for themselves the same profits sought by the hostile bidder. The present-day management buyout developed primarily as a defensive response to the attacks of the financial entrepreneurs and other acquisition-hungry companies.
Top executives who became the equity holders in the private companies that followed buyouts generally have found this new defense as enormously profitable as the comparable offensive purchases of the financiers who initiated the first round of leveraged stock acquisitions. Likewise, the leveraged recapitalization can be viewed largely as management's attempt to effect the same reconciliation of values between stock prices and corporate assets by which a hostile bidder seeks to profit, while keeping the company independent with ownership continuing in the hands of the public shareholders. Here too, however, management will often grab a slice of the equity pie as an incentive booster in the course of revamping the corporation's capital structure. It appears that, as with buyouts, top executives find the leveraged recapitalization quite profitable.

    Uncertainty modelling in multi-criteria analysis of water safety measures

    Water utilities must assess risks and make decisions on safety measures in order to obtain a safe and sustainable drinking water supply. The World Health Organization emphasises preparation of Water Safety Plans, in which risk ranking by means of risk matrices with discretised probability and consequence scales is commonly used. Risk ranking enables prioritisation of risks, but there is currently no common and structured way of performing uncertainty analysis and using risk ranking for evaluating and comparing water safety measures. To enable a proper prioritisation of safety measures and an efficient use of available resources for risk reduction, two alternative models linking risk ranking and multi-criteria decision analysis (MCDA) are presented and evaluated. The two models specifically enable uncertainty modelling in MCDA, and they differ in terms of how uncertainties in risk levels are considered. The need for formal handling of risk and uncertainty in MCDA is emphasised in the literature, and the suggested models provide innovations that are not dependent on the application domain. In the case study application presented here, possible safety measures are evaluated based on the benefit of estimated risk reduction, the cost of implementation and the probability of not achieving an acceptable risk level. Additional criteria such as environmental impact and consumer trust may also be included when applying the models. The case study shows how safety measures can be ranked based on preference scores or cost-effectiveness and how measures not reducing the risk enough can be identified and disqualified. Furthermore, the probability of each safety measure being ranked highest can be calculated. The two models provide a stepwise procedure for prioritising safety measures and enable a formalised handling of uncertainties in input data and results.
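A minimal sketch of the kind of uncertainty propagation such models enable: Monte Carlo sampling of uncertain risk-reduction estimates to rank safety measures by expected cost-effectiveness. The measure names, costs and reduction intervals below are invented for illustration and are not taken from the case study.

```python
import random

# Hypothetical safety measures: implementation cost (arbitrary units)
# and risk reduction given as an uncertain (low, high) interval.
measures = {
    "UV disinfection":    {"cost": 120.0, "risk_reduction": (0.3, 0.7)},
    "Extra chlorination": {"cost": 40.0,  "risk_reduction": (0.1, 0.4)},
    "New intake filter":  {"cost": 200.0, "risk_reduction": (0.5, 0.9)},
}

def expected_cost_effectiveness(meas, n=10_000, seed=1):
    """Monte Carlo estimate of mean risk reduction per unit cost,
    propagating uniform uncertainty in the reduction estimate."""
    rng = random.Random(seed)
    lo, hi = meas["risk_reduction"]
    mean_reduction = sum(rng.uniform(lo, hi) for _ in range(n)) / n
    return mean_reduction / meas["cost"]

# Rank measures by expected cost-effectiveness, best first.
ranking = sorted(measures,
                 key=lambda m: expected_cost_effectiveness(measures[m]),
                 reverse=True)
print(ranking)
# → ['Extra chlorination', 'UV disinfection', 'New intake filter']
```

With distributions on each criterion, the same sampling loop also yields the probability of each measure being ranked highest, as described in the abstract.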

    Failures to disagree are essential for environmental science to effectively influence policy development

    While environmental science, and ecology in particular, is working to provide a better understanding on which to base sustainable decisions, the way scientific understanding is developed can at times be detrimental to this cause. Locked-in debates are often unnecessarily polarised and can compromise any common goals of the opposing camps. The present paper is inspired by a resolved debate from the unrelated field of psychology, where Nobel laureate Daniel Kahneman and Gary Klein turned what seemed to be a locked-in debate into a constructive process for their fields. The present paper is also motivated by previous discourses regarding the role of thresholds in natural systems for management and governance, but its scope of analysis targets the scientific process within complex social-ecological systems in general. We identified four features of environmental science that appear to predispose it to locked-in debates: (1) the strongly context-dependent behaviour of ecological systems; (2) the dominant role of single-hypothesis testing; (3) the high prominence given to theory demonstration compared to investigation; and (4) the effect of urgent demands to inform and steer policy. This fertile ground is further cultivated by human psychological aspects as well as the structure of funding and publication systems.

    Ecological Memory of Historical Contamination Influences the Response of Phytoplankton Communities

    Ecological memory (EM) recognizes the importance of previous stress encounters in promoting community tolerance and thereby enhances ecosystem stability, provided that gained tolerances are preserved during non-stress periods. Drawing from this concept, we hypothesized that the recruitment of tolerant species can be facilitated by imposing an initial sorting process (conditioning) during the early stages of community assembly, which should result in higher production (biomass development and photosynthetic efficiency) and stable community composition. To test this, phytoplankton resting stages were germinated from lake sediments originating from two catchments that differed in contamination history: one impacted by long-term herbicide and pesticide exposures (historically contaminated lake) from an agricultural catchment, compared to a low-impacted one (near-pristine lake) from a forested catchment. Conditioning was achieved by adding an herbicide (Isoproturon, which was commonly used in the catchment of the historically contaminated lake) during germination. Afterward, the communities obtained from germination were exposed to an increasing gradient of Isoproturon. As hypothesized, upon conditioning, the phytoplankton assemblages from the historically contaminated lake were able to rapidly restore photosynthetic efficiency (p > 0.01) and became structurally (community composition) more resistant to Isoproturon. The communities of the near-pristine lake did not yield these positive effects regardless of conditioning, supporting that EM was a unique attribute of the historically stressed ecosystem. Moreover, assemblages that displayed higher structural resistance concurrently yielded lower biomass, indicating that the benefits of EM in increasing structural stability may trade off against production. Our results clearly indicate that EM can foster ecosystem stability to a recurring stressor.

    GAMA/H-ATLAS: Common star formation rate indicators and their dependence on galaxy physical parameters

    We compare common star formation rate (SFR) indicators in the local Universe in the Galaxy and Mass Assembly (GAMA) equatorial fields (∼160 deg²), using ultraviolet (UV) photometry from GALEX, far-infrared and sub-millimetre (sub-mm) photometry from the Herschel Astrophysical Terahertz Large Area Survey, and Hα spectroscopy from the GAMA survey. With a high-quality sample of 745 galaxies (median redshift 〈z〉 = 0.08), we consider three SFR tracers: UV luminosity corrected for dust attenuation using the UV spectral slope β (SFR_UV,corr), Hα line luminosity corrected for dust using the Balmer decrement (BD) (SFR_Hα,corr), and the combination of UV and infrared (IR) emission (SFR_UV+IR). We demonstrate that SFR_UV,corr can be reconciled with the other two tracers after applying attenuation corrections by calibrating the infrared excess (IRX; i.e. the IR to UV luminosity ratio) and the attenuation in Hα (derived from the BD) against β. However, β on its own is very unlikely to be a reliable attenuation indicator. We find that attenuation correction factors depend on parameters such as stellar mass (M_*), z and dust temperature (T_dust), but not on Hα equivalent width or Sérsic index. Due to the large scatter in the IRX versus β correlation, when compared to SFR_UV+IR, the β-corrected SFR_UV,corr exhibits systematic deviations as a function of IRX, BD and T_dust.
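As one illustration of a β-based attenuation correction of the kind discussed above (which the abstract itself cautions is unreliable when β is used alone), the sketch below assumes the external Meurer et al. (1999) starburst calibration A_FUV = 4.43 + 1.99β; the coefficients come from that calibration, not from this work.

```python
# Minimal sketch: dust-correcting an observed UV luminosity using the
# UV spectral slope beta, under the assumed Meurer et al. (1999)
# relation A_FUV = 4.43 + 1.99*beta (clipped at zero attenuation).

def uv_attenuation_mag(beta):
    """FUV attenuation in magnitudes from the UV spectral slope."""
    return max(0.0, 4.43 + 1.99 * beta)

def corrected_luminosity(l_uv_obs, beta):
    """Dust-corrected UV luminosity: L_corr = L_obs * 10^(0.4 * A_FUV)."""
    return l_uv_obs * 10 ** (0.4 * uv_attenuation_mag(beta))

# A moderately red slope implies modest attenuation; a very blue slope
# (beta = -3) implies no correction at all under this relation.
print(uv_attenuation_mag(-1.5))          # about 1.45 mag
print(corrected_luminosity(1.0, -3.0))   # 1.0 (no correction)
```

The scatter in the IRX versus β relation noted in the abstract is exactly why such a single-slope correction can deviate systematically from SFR_UV+IR.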

    Galaxy And Mass Assembly (GAMA) : the large-scale structure of galaxies and comparison to mock universes

    MA acknowledges funding from the University of St Andrews and the International Centre for Radio Astronomy Research. ASGR is supported by funding from a UWA Fellowship. PN acknowledges the support of the Royal Society through the award of a University Research Fellowship and the European Research Council, through receipt of a Starting Grant (DEGAS-259586). MJIB acknowledges the financial support of the Australian Research Council Future Fellowship 100100280. TMR acknowledges support from a European Research Council Starting Grant (DEGAS-259586).

    From a volume-limited sample of 45 542 galaxies and 6000 groups with z ≤ 0.213, we use an adapted minimal spanning tree algorithm to identify and classify large-scale structures within the Galaxy And Mass Assembly (GAMA) survey. Using galaxy groups, we identify 643 filaments across the three equatorial GAMA fields that span up to 200 h⁻¹ Mpc in length, each with an average of eight groups within them. By analysing galaxies not belonging to groups, we identify a secondary population of smaller coherent structures composed entirely of galaxies, dubbed 'tendrils', that appear to link filaments together or penetrate into voids, generally measuring around 10 h⁻¹ Mpc in length and containing on average six galaxies. Finally, we are also able to identify a population of isolated void galaxies. By running this algorithm on GAMA mock galaxy catalogues, we compare the characteristics of large-scale structure between observed and mock data, finding that mock filaments reproduce observed ones extremely well. This provides a probe of higher order distribution statistics not captured by the popularly used two-point correlation function.
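The abstract does not spell out the adapted minimal spanning tree algorithm; as a toy illustration of the general idea, the sketch below builds a Euclidean MST over 2-D points with Prim's algorithm, cuts edges longer than a linking length, and reports the resulting groups, loosely analogous to separating filament members from isolated void galaxies. The coordinates and threshold are invented.

```python
import math

def mst_edges(points):
    """Prim's algorithm: edges (d, i, j) of the Euclidean minimal
    spanning tree over a list of 2-D points."""
    n = len(points)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j in in_tree:
                    continue
                d = math.dist(points[i], points[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        edges.append(best)
        in_tree.add(best[2])
    return edges

def structures(points, max_link):
    """Cut MST edges longer than max_link; return groups of point
    indices, largest first (union-find over the surviving edges)."""
    parent = list(range(len(points)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for d, i, j in mst_edges(points):
        if d <= max_link:
            parent[find(i)] = find(j)
    groups = {}
    for k in range(len(points)):
        groups.setdefault(find(k), []).append(k)
    return sorted(groups.values(), key=len, reverse=True)

pts = [(0, 0), (1, 0), (2, 0), (10, 10), (11, 10), (30, 30)]
print(structures(pts, max_link=2.0))
# → [[0, 1, 2], [3, 4], [5]]  (a chain, a pair, and an isolated point)
```

The real analysis works on galaxy groups in redshift space with survey-specific linking scales, but the cut-the-long-edges step is the core of MST-based structure finding.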

    Galaxy And Mass Assembly (GAMA) : refining the local galaxy merger rate using morphological information

    KRVS acknowledges the Science and Technology Facilities Council (STFC) for providing funding for this project, as well as the Government of Catalonia for a research travel grant (ref. 2010 BE-00268) to begin this project at the University of Nottingham. PN acknowledges the support of the Royal Society through the award of a University Research Fellowship and the European Research Council, through receipt of a Starting Grant (DEGAS-259586).

    We use the Galaxy And Mass Assembly (GAMA) survey to measure the local Universe mass-dependent merger fraction and merger rate using galaxy pairs and the CAS (concentration, asymmetry, and smoothness) structural method, which identifies highly asymmetric merger candidate galaxies. Our goals are to determine which types of mergers produce highly asymmetrical galaxies and to provide a new measurement of the local galaxy major merger rate. We examine galaxy pairs at stellar mass limits down to M_* = 10^8 M⊙ and find that in major mergers (mass ratios of up to 4:1) the lower-mass companion becomes highly asymmetric, whereas the larger galaxy is much less affected. The fraction of highly asymmetric paired galaxies which have a major merger companion is highest for the most massive galaxies and drops progressively with decreasing mass. We calculate that the mass-dependent major merger fraction is fairly constant at ∼1.3–2 per cent within 10^9.5 < M_* < 10^11.5 M⊙, and increases to ∼4 per cent at lower masses. When the observability time-scales are taken into consideration, the major merger rate is found to approximately triple over the mass range we consider. The total comoving volume major merger rate over the range 10^8.0 < M_* < 10^11.5 M⊙ is (1.2 ± 0.5) × 10⁻³ h₇₀³ Mpc⁻³ Gyr⁻¹.
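The step from a merger fraction to a merger rate, mentioned above via observability time-scales, is simple division: if a fraction f of galaxies is seen merging and each merger is identifiable for a time T, the rate per galaxy is f/T. The numbers below are placeholders for illustration, not the paper's measured values.

```python
# Toy arithmetic linking a merger fraction to a merger rate via an
# observability time-scale (illustrative numbers only).

def merger_rate(fraction, timescale_gyr):
    """Mergers per galaxy per Gyr, assuming each merger stays
    identifiable (e.g. as a highly asymmetric galaxy) for
    timescale_gyr."""
    return fraction / timescale_gyr

# A 2 per cent merger fraction with a 0.5 Gyr observability window
# implies 0.04 mergers per galaxy per Gyr.
print(merger_rate(0.02, 0.5))  # → 0.04
```

Multiplying by the comoving number density of galaxies in the mass range then gives a comoving volume rate in Mpc⁻³ Gyr⁻¹, the units quoted in the abstract.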

    Galaxy and Mass Assembly (GAMA): ugriz galaxy luminosity functions

    Galaxy and Mass Assembly (GAMA) is a project to study galaxy formation and evolution, combining imaging data from ultraviolet to radio with spectroscopic data from the AAOmega spectrograph on the Anglo-Australian Telescope. Using data from Phase 1 of GAMA, taken over three observing seasons, and correcting for various minor sources of incompleteness, we calculate galaxy luminosity functions (LFs) and their evolution in the ugriz passbands. At low redshift, z < 0.1, we find that blue galaxies, defined according to a magnitude-dependent but non-evolving colour cut, are reasonably well fitted over a range of more than 10 magnitudes by simple Schechter functions in all bands. Red galaxies, and the combined blue plus red sample, require double power-law Schechter functions to fit a dip in their LF faintwards of the characteristic magnitude M* before a steepening faint end. This upturn is at least partly due to dust-reddened disc galaxies. We measure the evolution of the galaxy LF over the redshift range 0.002 < z < 0.5, both by using a parametric fit and by measuring binned LFs in redshift slices. The characteristic luminosity L* is found to increase with redshift in all bands, with red galaxies showing stronger luminosity evolution than blue galaxies. The comoving number density of blue galaxies increases with redshift, while that of red galaxies decreases, consistent with prevailing movement from blue cloud to red sequence. As well as being more numerous at higher redshift, blue galaxies also dominate the overall luminosity density beyond redshifts z ≃ 0.2. At lower redshifts, the luminosity density is dominated by red galaxies in the riz bands, and by blue galaxies in u and g.
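The Schechter function referred to above has a standard form in magnitudes, φ(M) = 0.4 ln(10) φ* x^(α+1) e^(−x) with x = 10^(0.4(M*−M)). The sketch below evaluates it with illustrative parameters, not GAMA's fitted values.

```python
import math

def schechter_mag(M, phi_star, M_star, alpha):
    """Schechter luminosity function in absolute magnitudes:
    phi(M) = 0.4*ln(10) * phi_star * x**(alpha+1) * exp(-x),
    where x = 10**(0.4*(M_star - M))."""
    x = 10 ** (0.4 * (M_star - M))
    return 0.4 * math.log(10) * phi_star * x ** (alpha + 1) * math.exp(-x)

# Illustrative (hypothetical) parameters: phi* = 0.01 Mpc^-3 mag^-1,
# M* = -20.5, faint-end slope alpha = -1.2. The exponential cutoff
# suppresses counts brightwards of M*, the power law governs the
# faint end.
for M in (-22.0, -20.5, -18.0):
    print(M, schechter_mag(M, phi_star=0.01, M_star=-20.5, alpha=-1.2))
```

The double power-law variant used for the red sample adds a second (α, φ*) component to capture the dip-and-upturn shape faintwards of M*.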