185 research outputs found
Exit, Voice and Loyalty from the Perspective of Hedge Funds Activism in Corporate Governance
This article discusses hedge funds activism based on Hirschman's classic. It is argued that hedge funds do not create the loyalty concerns underlying the usual short-termism critique of their activism, because the arbiters of such activism are typically index funds, which cannot choose short-term exit. Nevertheless, the voice activated by hedge funds can be excessive for a particular company. Furthermore, this article claims that the short-termism debate cannot shed light on the desirability of hedge funds activism. Neither theory nor empirical evidence can tell whether hedge funds activism leads to short-termism or long-termism. The real issue with activism is a conflict of entrepreneurship, namely a conflict between the opposing views of the activists and the incumbent management regarding the time horizon over which an individual company should be profitable. Leaving the choice between these views to institutional investors is not efficient for every company at every point in time. Consequently, this article argues that regulation should enable individual companies to choose whether to curb hedge funds activism depending on what is efficient for them. The recent European experience reveals that loyalty shares enable such choice, even in the midstream, operating as dual-class shares in disguise. However, loyalty shares can often be introduced without institutional investors' consent. This outcome could be improved by allowing dual-class recapitalisations, instead of loyalty shares, but only with a majority of minority vote. This solution would screen for the companies for which temporarily curbing activism is efficient, and induce these companies to negotiate sunset clauses with institutional investors.
Methodology of Law and Economics
Introduction
A chapter on the methodology of law and economics, i.e. the economic analysis of law, concerns the methodology of economics. Becker's well-known characterization (Becker 1976, 5) shows that economics should not be defined by its subject, but by its method (also Veljanovski 2007, 19). This method forms the core of our contribution.
We discuss several related issues. In his entry on methodology in the Encyclopedia of Law and Economics, Kerkmeester (2000) states that most legal economists follow a pragmatic, eclectic approach and that it is hard to fit them into a particular school. A review of the methodology of law and economics must therefore concentrate on the ideas which are shared by the vast majority of legal economists (Kerkmeester 2000, 383). De Geest defines the use of elements from different schools as the 'integrated paradigm', and the predominant approach to law and economics as the 'mainstream approach' (De Geest 1994, 459ff; Mackaay 1991, p. 1509).
In law and economics, the economic approach operates on two distinct levels. First, human choice is analyzed from an economic point of view. The predominant approach here is the rational choice theory, which we discuss in Section 2. The basic idea of this theory is that human behaviour is analyzed as if people are seeking to maximize their expected utility.
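As a formal anchor for this 'as if' assumption, a standard textbook statement (our addition, not taken from the chapter itself) is that an agent facing actions $a \in A$ with uncertain outcomes $x_i$ occurring with probabilities $p_i(a)$ behaves as if solving

$$\max_{a \in A} \; \mathbb{E}[u(a)] = \sum_i p_i(a)\, u(x_i),$$

where $u$ is a stable utility function over outcomes. Rational choice theory does not claim that people consciously perform this calculation, only that their behaviour is well predicted by its solution.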
The second level of the economic approach concerns the goals which are attributed to the legal system. In Section 3, we discuss the concept of market failure, which in law and economics is regarded as the primary raison d'être of law. Legal rules are analyzed as instruments to correct market failure, or at least to reduce its adverse consequences. We briefly illustrate this idea by discussing, among others, competition law, tort law, patent law and consumer law as instruments to counter market power, negative externalities, collective goods and information asymmetry, respectively.
In Section 4, we discuss the Coase Theorem, which states that the allocation of legal entitlements between market players is irrelevant for efficiency when the parties can transact these entitlements costlessly. Given that transaction costs are positive in the real world, we also pay attention to their implications for regulation.
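A stylized numerical illustration of the theorem (our example, not drawn from the chapter): suppose a factory's emissions cause a neighbour a loss of 100, while abatement costs the factory 60. If the neighbour holds the entitlement to be free of emissions, the factory abates, since paying 60 is cheaper than compensating 100. If instead the factory holds the entitlement to emit, the neighbour pays the factory some price between 60 and 100 to abate. With zero transaction costs, the efficient outcome (abatement) is reached under either legal rule; only the distribution of wealth differs. If bargaining costs exceed the joint surplus of 40, however, the second rule leaves the emissions in place, and the initial allocation of the entitlement determines the outcome.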
In Section 5, we discuss 'behavioural law and economics'. This relatively recent approach is based on insights from cognitive psychology, suggesting that people do not always act rationally. After reviewing the major findings in this field, we elaborate on the consequences for the more traditional rational choice approach.
In Section 6, we conclude.
Illiquidity and financial crisis
This article analyzes the determinants of liquidity crises based on the dynamics of banking and finance under Knightian uncertainty. In this perspective, the facts of the global financial crisis seem to confirm Minsky's hypothesis of endogenous financial instability, derived from Keynes's theory of liquidity and expectations. Conventional expectations allow overcoming uncertainty via the liquidity of secondary markets and, in turn, of banks' liabilities that are accepted as money. However, the failure of existing conventions drives the system into uncertainty-driven liquidity spirals, which are all the more dangerous the more private money financial intermediaries have managed to create in the first place. Despite the limited availability of data that can proxy for Knightian uncertainty, this approach to liquidity problems may explain better than others how a relatively small shock, such as the default of U.S. subprime mortgages, could trigger a worldwide systemic crisis.
Featuring Control Power: Corporate Law and Economics Revisited
This dissertation reappraises the existing framework for the economic analysis of corporate law. The standard approach to the legal foundations of corporate governance is based on the 'law matters' thesis, according to which corporate law promotes separation of ownership and control by protecting minority shareholders from expropriation. This book takes a broader perspective on the economic and legal determinants of corporate governance. It shows that investor protection is a necessary, but not sufficient, legal condition for efficient separation of ownership and control. Supporting the control powers vested in managers or controlling shareholders is at least as important as protecting investors from their abuse. Corporate law does not matter only in the latter respect; it matters in both.
This result is derived by interpreting corporate governance based on three categories of private benefits of control. Corporate law affects corporate governance depending on its impact on each category of private benefits, and not just on those accounting for shareholder expropriation. Three major areas of corporate law are considered from this perspective. The first is the legal distribution of corporate powers. The second is the discipline of related-party transactions. The third is the regulation of control transactions. The three areas are investigated comparatively in the US, the UK, Italy, Sweden, and the Netherlands. The investigation shows that, when corporate law is analyzed in this fashion, it explains the different patterns and performance of corporate governance. This account of corporate law is not only useful for understanding separation of ownership and control, but also for indicating how to improve its efficiency through legal intervention.
Book review: "Corporate governance: New challenges and opportunities"
This review covers the book titled "Corporate Governance: New Challenges and Opportunities", written by Alexander N. Kostyuk, Udo Braendle and Vincenzo Capizzi (Virtus Interpress, 2017, Hardcover, ISBN: 978-617-7309-00-9). The review briefly outlines the structure of the book and highlights its strengths and the issues that, in the reviewers' view, will be most interesting to the reader.
Quantification of Myocardial Blood Flow in Absolute Terms Using (82)Rb PET Imaging: The RUBY-10 Study.
OBJECTIVES: The purpose of this study was to compare myocardial blood flow (MBF) and myocardial flow reserve (MFR) estimates from rubidium-82 positron emission tomography ((82)Rb PET) data using 10 software packages (SPs) based on 8 tracer kinetic models.
BACKGROUND: It is unknown how MBF and MFR values from existing SPs agree for (82)Rb PET.
METHODS: Rest and stress (82)Rb PET scans of 48 patients with suspected or known coronary artery disease were analyzed in 10 centers. Each center used 1 of 10 SPs to analyze global and regional MBF using the different kinetic models implemented. Values were considered to agree if they simultaneously had an intraclass correlation coefficient >0.75 and a difference <20% of the median across all programs (a sketch of this agreement check follows the abstract).
RESULTS: The most common model evaluated was the Ottawa Heart Institute 1-tissue compartment model (OHI-1-TCM). MBF values from 7 of 8 SPs implementing this model agreed best. Values from 2 other models (alternative 1-TCM and Axially distributed) also agreed well, with occasional differences. The MBF results from other models (e.g., 2-TCM and retention) were less in agreement with values from OHI-1-TCM.
CONCLUSIONS: SPs using the most common kinetic model (OHI-1-TCM) provided consistent results in measuring global and regional MBF values, suggesting that they may be used interchangeably to process data acquired with a common imaging protocol.
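A minimal Python sketch of that agreement check, under our reading of the criterion and with simulated data; the one-way random-effects ICC used here is one common variant and may differ from the paper's exact computation, and all names and values are hypothetical.

    import numpy as np

    def icc_oneway(x):
        # One-way random-effects ICC; x is (n_patients, n_packages) of MBF values.
        n, k = x.shape
        subject_means = x.mean(axis=1)
        msb = k * ((subject_means - x.mean()) ** 2).sum() / (n - 1)
        msw = ((x - subject_means[:, None]) ** 2).sum() / (n * (k - 1))
        return (msb - msw) / (msb + (k - 1) * msw)

    def packages_agree(a, b, median_all, icc_min=0.75, diff_frac=0.20):
        # a, b: per-patient global MBF from two software packages;
        # median_all: median MBF across all programs (the reference value).
        icc = icc_oneway(np.column_stack([a, b]))
        return icc > icc_min and abs(a.mean() - b.mean()) < diff_frac * median_all

    # Simulated stand-in for 48 patients analyzed by two packages.
    rng = np.random.default_rng(42)
    truth = rng.normal(2.0, 0.5, size=48)        # "true" stress MBF, ml/min/g
    a = truth + rng.normal(0, 0.1, size=48)
    b = truth + rng.normal(0, 0.1, size=48)
    print(packages_agree(a, b, median_all=np.median(np.concatenate([a, b]))))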
Quality indicators for patients with traumatic brain injury in European intensive care units
Background: The aim of this study is to validate a previously published consensus-based quality indicator set for the management of patients with traumatic brain injury (TBI) at intensive care units (ICUs) in Europe and to study its potential for quality measurement.
Changing care pathways and between-center practice variations in intensive care for traumatic brain injury across Europe
Purpose: To describe ICU stay, selected management aspects, and outcome of intensive care unit (ICU) patients with traumatic brain injury (TBI) in Europe, and to quantify variation across centers. Methods: This is a prospective observational multicenter study conducted across 18 countries in Europe and Israel. Admission characteristics, clinical data, and outcome were described at patient and center levels. Between-center variation in the total ICU population was quantified with the median odds ratio (MOR), with correction for case-mix and random variation between centers. Results: A total of 2138 patients were admitted to the ICU, with a median age of 49 years; 36% had mild TBI (Glasgow Coma Scale; GCS 13–15). Within 72 h, 636 (30%) were discharged and 128 (6%) died. Early deaths and long-stay patients (> 72 h) had more severe injuries based on the GCS and neuroimaging characteristics, compared with short-stay patients. Long-stay patients received more monitoring, were treated at higher intensity, and experienced worse 6-month outcomes compared to short-stay patients. Between-center variations were prominent in the proportion of short-stay patients (MOR = 2.3, p < 0.001), use of intracranial pressure (ICP) monitoring (MOR = 2.5, p < 0.001) and aggressive treatment.
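For readers unfamiliar with the median odds ratio: it translates the between-center variance of a random-intercept logistic model into an odds-ratio scale, i.e., the median odds ratio between a randomly chosen higher- and lower-risk center. A minimal sketch using the standard formula MOR = exp(sqrt(2*sigma^2) * Phi^{-1}(0.75)); the variance value below is back-calculated for illustration and is not reported in the abstract.

    import numpy as np
    from scipy.stats import norm

    def median_odds_ratio(sigma2):
        # sigma2: variance of center-level random intercepts (log-odds scale)
        return np.exp(np.sqrt(2.0 * sigma2) * norm.ppf(0.75))

    # A between-center variance of roughly 0.77 on the log-odds scale
    # reproduces the MOR of about 2.3 reported for short-stay proportion.
    print(median_odds_ratio(0.77))   # -> ~2.31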
Machine learning algorithms performed no better than regression models for prognostication in traumatic brain injury
Objective: We aimed to explore the added value of common machine learning (ML) algorithms for prediction of outcome for moderate and severe traumatic brain injury. Study Design and Setting: We performed logistic regression (LR), lasso regression, and ridge regression with key baseline predictors in the IMPACT-II database (15 studies, n = 11,022). ML algorithms included support vector machines, random forests, gradient boosting machines, and artificial neural networks, and were trained using the same predictors. To assess the generalizability of predictions, we performed internal, internal-external, and external validation on the recent CENTER-TBI study (patients with Glasgow Coma Scale <13, n = 1,554). Both calibration (calibration slope/intercept) and discrimination (area under the curve) were quantified. Results: In the IMPACT-II database, 3,332/11,022 (30%) died and 5,233 (48%) had unfavorable outcome (Glasgow Outcome Scale less than 4). In the CENTER-TBI study, 348/1,554 (29%) died and 651 (54%) had unfavorable outcome. Discrimination and calibration varied widely between the studies and less so between the studied algorithms. The mean area under the curve was 0.82 for mortality and 0.77 for unfavorable outcomes in the CENTER-TBI study. Conclusion: ML algorithms may not outperform traditional regression approaches in a low-dimensional setting for outcome prediction after moderate or severe traumatic brain injury. Similar to regression-based prediction models, ML algorithms should be rigorously validated to ensure applicability to new populations.
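A minimal sketch of the validation metrics named above (discrimination as area under the curve, calibration as slope and intercept of the linear predictor), using simulated stand-ins for the development and validation cohorts; the dataset shapes mimic the abstract, but effect sizes are invented and the paper's exact modeling choices are not reproduced.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    # Hypothetical stand-ins for the development (IMPACT-II-like) and
    # external validation (CENTER-TBI-like) cohorts.
    X_dev, X_val = rng.normal(size=(11022, 5)), rng.normal(size=(1554, 5))
    beta = np.array([0.9, -0.6, 0.5, 0.3, 0.0])           # invented effects
    y_dev = rng.binomial(1, 1 / (1 + np.exp(-(X_dev @ beta - 0.8))))
    y_val = rng.binomial(1, 1 / (1 + np.exp(-(X_val @ beta - 0.8))))

    model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
    p = model.predict_proba(X_val)[:, 1]
    lp = np.log(p / (1 - p))                              # logit of predicted risk

    auc = roc_auc_score(y_val, p)                         # discrimination
    cal = LogisticRegression(C=1e6, max_iter=1000).fit(lp.reshape(-1, 1), y_val)
    slope, intercept = cal.coef_[0, 0], cal.intercept_[0] # calibration
    print(f"AUC={auc:.2f}, slope={slope:.2f}, intercept={intercept:.2f}")

In practice the calibration intercept is usually estimated with the slope fixed at 1 (an offset model); both are fit jointly here only to keep the sketch short.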
Measurement of prompt hadron production ratios in $pp$ collisions at $\sqrt{s} = 0.9$ and 7 TeV
The charged-particle production ratios $\bar{p}/p$, $K^-/K^+$, $\pi^-/\pi^+$, $(p+\bar{p})/(\pi^++\pi^-)$, $(K^++K^-)/(\pi^++\pi^-)$ and $(p+\bar{p})/(K^++K^-)$ are measured with the LHCb detector using $pp$ collisions delivered by the LHC at $\sqrt{s} = 0.9$ TeV and $\sqrt{s} = 7$ TeV. The measurements are performed as a function of transverse momentum $p_{\mathrm{T}}$ and pseudorapidity $\eta$. The production ratios are compared to the predictions of several Monte Carlo generator settings, none of which are able to describe adequately all observables. The ratio $\bar{p}/p$ is also considered as a function of rapidity loss, $\Delta y \equiv y_{\mathrm{beam}} - y$, and is used to constrain models of baryon transport.
Comment: Incorrect entries in Table 2 corrected. No consequences for rest of paper.
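For reference, the quantities in the final sentence use the standard definitions (not specific to this paper):

$$y = \frac{1}{2}\ln\frac{E + p_z}{E - p_z}, \qquad \Delta y \equiv y_{\mathrm{beam}} - y,$$

where $E$ and $p_z$ are the particle's energy and longitudinal momentum and $y_{\mathrm{beam}}$ is the rapidity of the incoming proton beam; $\Delta y$ thus measures how far in rapidity a baryon has been transported from the beam.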
- …