826 research outputs found
Capturing Regulatory Reality: Stigler's The Theory of Economic Regulation
This paper offers a retrospective assessment of economist George Stigler's classic article, The Theory of Economic Regulation. Stigler argued that regulation is a product that, just like any other product, is produced in a market, and that it can be acquired from the governmental "marketplace" by business firms to serve their private interests and create barriers to entry for potential competitors. He challenged the idea that regulation arises solely to serve the public interest and demonstrated that important political advantages held by businesses can contribute to industry capture of the regulatory process. Although his argument was largely based on the theoretical framework he developed, Stigler also illustrated his insights with empirical evidence from state-level regulatory schemes, including trucking regulation and occupational licensing. In this paper, we re-examine Stigler's argument and analysis more than forty years later. Despite the great value of Stigler's work in illuminating the problem of regulatory capture, his influential article nevertheless exaggerated the power of business over regulators, suggesting a near-iron law of business control that clearly does not exist. He also confusingly conflated elected legislators with more independent agency bureaucrats, failed to rule out the public interest theory of regulation, and relied in part on unrealistic assumptions about the political economy of regulation. Notwithstanding these shortcomings, Stigler's ground-breaking theory holds enduring value for both scholars and policymakers, and his innovative use of economic principles and empirical analysis provides a much-needed template for the further study of regulation and regulatory institutions even today.
Structured to Fail? Explaining Regulatory Performance under Competing Mandates
Following each of three major disasters (the financial crisis, the Gulf oil spill, and the nuclear meltdown in Japan), policymakers responded by overhauling the associated regulatory infrastructure. In each case, the response was intended to sharpen the regulator's focus, predicated on the widely held view that asking an agency to satisfy both regulatory and non-regulatory roles induces organizational conflict and impedes performance. In this dissertation, I put this commonly accepted belief about agency structure to the test by analyzing the behavior of regulators also assigned significant, non-regulatory functions. Incorporating data on a broad set of U.S. federal agencies, I first establish that the conventional wisdom holds some truth: regulators that combine purposes do not perform as well. Even so, through a mix of statistical analyses, formal modeling, and an in-depth study of the former U.S. offshore oil and gas regulator, the Minerals Management Service, I show that assigning regulatory and non-regulatory functions to one agency can, in some cases, still be better than dividing them between agencies. I demonstrate that while the goal ambiguity and conflict introduced by combining roles do affect behavior, overemphasizing this issue misses several important factors affecting regulators tasked with non-regulatory aims. These factors explain both how regulators operate when charged with achieving other goals and why these multiple-purpose mandates persist. First, although the goals may conflict, the underlying tasks supporting these divergent purposes may still require extensive coordination. Second, even within agencies, introducing features that encourage separation between the affected groups can allow regulators to manage ambiguity, but these efforts can simultaneously exacerbate difficulties in achieving the synergies generated through close contact. Third, even when the conditions for conflict are present, political and public preferences, and not just internal factors, can play important roles in shaping agency priorities. Fourth, broader social, industry, and environmental shifts can attenuate or accentuate the organizational tension between managing goal ambiguity and encouraging underlying coordination. In sum, only by recognizing roles for a diverse set of forces (operations, organization, politics, and environment) can the existence, behavior, and performance of regulatory agencies that balance non-regulatory mandates be logically explained.
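The dissertation's first empirical claim, that regulators combining regulatory and non-regulatory purposes perform less well, is at heart a cross-agency comparison. The following is a minimal sketch of that general kind of test on entirely hypothetical agency-level data with invented variable names (performance_score, combined_mandate, log_budget, log_staff); it illustrates the approach, not the author's actual models or data.

# Hypothetical illustration: do agencies that combine regulatory and
# non-regulatory mandates show lower performance, controlling for size?
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 120  # invented agency-year observations
df = pd.DataFrame({
    "combined_mandate": rng.integers(0, 2, n),  # 1 = agency also holds non-regulatory roles
    "log_budget": rng.normal(5.0, 1.0, n),      # placeholder control
    "log_staff": rng.normal(7.0, 0.8, n),       # placeholder control
})
# Simulated outcome in which combined mandates modestly lower performance
df["performance_score"] = (
    60 - 4 * df["combined_mandate"] + 2 * df["log_budget"] + rng.normal(0, 5, n)
)

model = smf.ols(
    "performance_score ~ combined_mandate + log_budget + log_staff", data=df
).fit()
print(model.summary())  # the coefficient on combined_mandate is the quantity of interest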
Tracer-based metabolic NMR-based flux analysis in a leukaemia cell line
High levels of reactive oxygen species (ROS) have a profound impact on acute myeloid leukaemia cells and can be used to specifically target these cells with novel therapies. We have previously shown how the combination of two redeployed drugs, the contraceptive steroid medroxyprogesterone and the lipid-regulating drug bezafibrate, exerts anti-leukaemic effects by producing ROS. Here we report a (13)C-tracer-based NMR metabolic study to understand how these drugs work in K562 leukaemia cells. Our study shows that [1,2-(13)C]glucose is incorporated into ribose sugars, indicating activity in the oxidative and non-oxidative pentose phosphate pathways alongside lactate production. There is little label incorporation into the tricarboxylic acid cycle from glucose, but much greater incorporation arises from the use of [3-(13)C]glutamine. The combined medroxyprogesterone and bezafibrate treatment decreases label incorporation from both glucose and glutamine into α-ketoglutarate and increases that into succinate, which is consistent with ROS-mediated conversion of α-ketoglutarate to succinate. Most interestingly, this combined treatment drastically reduces the production of several pyrimidine synthesis intermediates.
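The readout in a 13C-tracer NMR study of this kind ultimately rests on fractional enrichment, the share of a metabolite pool carrying the label, estimated from labelled versus total signal. Below is a minimal sketch of that arithmetic with made-up peak areas and a crude natural-abundance correction; it is not the authors' processing pipeline.

# Hypothetical illustration: fractional 13C enrichment from NMR peak areas.
# Natural abundance of 13C is ~1.1%, subtracted here as a simple correction.
NATURAL_ABUNDANCE = 0.011

def fractional_enrichment(labelled_area: float, total_area: float) -> float:
    """Fraction of the metabolite pool carrying the 13C label,
    corrected for natural abundance (illustrative only)."""
    raw = labelled_area / total_area
    return max(0.0, (raw - NATURAL_ABUNDANCE) / (1.0 - NATURAL_ABUNDANCE))

# Invented peak areas for lactate in control vs. drug-treated cells
control = fractional_enrichment(labelled_area=3.2, total_area=10.0)
treated = fractional_enrichment(labelled_area=1.9, total_area=10.0)
print(f"control enrichment: {control:.2%}, treated: {treated:.2%}")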
Consumer's Guide to Regulatory Impact Analysis: Ten Tips for Being an Informed Policymaker
Regulatory impact analyses (RIAs) weigh the benefits of regulations against the burdens they impose and are invaluable tools for informing decision makers. We offer 10 tips for nonspecialist policymakers and interested stakeholders who will be reading RIAs as consumers.
1. Core problem: Determine whether the RIA identifies the core problem (compelling public need) the regulation is intended to address.
2. Alternatives: Look for an objective, policy-neutral evaluation of the relative merits of reasonable alternatives.
3. Baseline: Check whether the RIA presents a reasonable "counterfactual" against which benefits and costs are measured.
4. Increments: Evaluate whether totals and averages obscure relevant distinctions and trade-offs.
5. Uncertainty: Recognize that all estimates involve uncertainty, and ask what effect key assumptions, data, and models have on those estimates.
6. Transparency: Look for transparency and objectivity of analytical inputs.
7. Benefits: Examine how projected benefits relate to stated objectives.
8. Costs: Understand what costs are included.
9. Distribution: Consider how benefits and costs are distributed.
10. Symmetrical treatment: Ensure that benefits and costs are presented symmetrically.
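Several of these tips (baseline, uncertainty, symmetrical treatment) come down to checking how net benefits move when key assumptions move. Below is a minimal sketch of that kind of sensitivity check, using invented annual benefit and cost figures and the commonly used 3% and 7% discount rates; it is an illustration, not guidance from the authors.

# Hypothetical illustration: present value of net benefits under
# alternative discount-rate assumptions (a basic RIA sensitivity check).
def present_value(annual_amount: float, rate: float, years: int) -> float:
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

annual_benefits = 120e6   # invented figures, dollars per year
annual_costs = 80e6
horizon = 10              # years

for rate in (0.03, 0.07):  # commonly used discount rates
    npv = (present_value(annual_benefits, rate, horizon)
           - present_value(annual_costs, rate, horizon))
    print(f"discount rate {rate:.0%}: net present value ~= ${npv / 1e6:,.0f} million")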
Milagro Observations of Multi-TeV Emission from Galactic Sources in the Fermi Bright Source List
We present the result of a search of the Milagro sky map for spatial correlations with sources from a subset of the recent Fermi Bright Source List (BSL). The BSL consists of the 205 most significant sources detected above 100 MeV by the Fermi Large Area Telescope. We select sources based on their categorization in the BSL, taking all confirmed or possible Galactic sources in the field of view of Milagro. Of the 34 Fermi sources selected, 14 are observed by Milagro at a significance of 3 standard deviations or more. We conduct this search with a new analysis which employs newly optimized gamma-hadron separation and utilizes the full 8-year Milagro dataset. Milagro is sensitive to gamma rays with energies from 1 to 100 TeV, with a peak sensitivity from 10-50 TeV depending on the source spectrum and declination. These results extend the observation of these sources far above the Fermi energy band. With the new analysis and additional data, multi-TeV emission is definitively observed associated with the Fermi pulsar J2229.0+6114 in the Boomerang Pulsar Wind Nebula (PWN). Furthermore, an extended region of multi-TeV emission is associated with the Fermi pulsar J0634.0+1745, the Geminga pulsar.
Comment: Accepted for publication in Astrophysical Journal Letters June 30, 200
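The headline number, 14 of 34 selected Fermi sources seen by Milagro at 3 standard deviations or more, is striking because so few would be expected by chance. Below is a minimal sketch of that back-of-the-envelope check, assuming independent sources and a one-sided Gaussian tail for the 3-sigma threshold; the collaboration's actual significance treatment is more involved.

# Hypothetical illustration: how unlikely is it to see >= 14 of 34
# sources at >= 3 sigma if each were a pure chance fluctuation?
from math import comb, erf, sqrt

p_single = 0.5 * (1 - erf(3 / sqrt(2)))   # one-sided tail beyond 3 sigma (~1.35e-3)
n, k = 34, 14

p_chance = sum(comb(n, i) * p_single**i * (1 - p_single)**(n - i) for i in range(k, n + 1))
print(f"P(single source >= 3 sigma by chance) ~ {p_single:.2e}")
print(f"P(at least {k} of {n} by chance)      ~ {p_chance:.2e}")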
Prognostic value of CT coronary angiography in diabetic and non-diabetic subjects with suspected CAD: importance of presenting symptoms
AIM:
To assess the prognostic relevance of 64-slice computed tomography coronary angiography (CT-CA) and symptoms in diabetics and non-diabetics referred for cardiac evaluation.
METHODS:
We followed 210 patients with type 2 diabetes (DM) and 203 non-diabetic patients referred for CT-CA to rule out coronary artery disease (CAD). Patients had no known history of CAD and were divided into four categories on the basis of symptoms at presentation (none, atypical angina, typical angina and dyspnoea). Clinical end points were major cardiac events (MACE): cardiac-related death, non-fatal myocardial infarction, unstable angina and cardiac revascularizations. Cox proportional hazard models, with and without adjustment for risk factors and a multiplicative interaction term (obstructive CAD × DM), were developed to predict outcome.
RESULTS:
DM patients with dyspnoea or who were asymptomatic showed a higher prevalence of obstructive CAD than non-diabetics (p ≤ 0.01). At a mean follow-up of 20.4 months, DM patients had worse cardiac event-free survival in comparison with non-DM patients (90% vs. 81%, p = 0.02). In multivariate analysis, CT-CA evidence of obstructive CAD predicted MACE in DM patients (HR: 6.4; 95% CI: 2.3-17.5) and in non-DM patients (HR: 5.6; 95% CI: 1.4-21.5; p = 0.01). In Cox regression analysis of the overall population, the interaction term obstructive CAD × DM was not significant.
CONCLUSIONS:
Among DM patients, dyspnoea carried a high event risk, with a MACE rate four times higher. CT-CA findings were strongly predictive of outcome and proved valuable for further risk stratification.
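The study's key modelling step is a Cox proportional hazards model that includes a multiplicative obstructive CAD × DM interaction. Below is a minimal sketch of that construction on simulated data with hypothetical column names (time, event, obstructive_cad, dm, cad_x_dm), using the lifelines package; it illustrates the general setup rather than the study's actual, risk-factor-adjusted model.

# Hypothetical illustration: Cox model with an obstructive CAD x DM interaction.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "obstructive_cad": rng.integers(0, 2, n),
    "dm": rng.integers(0, 2, n),
})
# Simulated months to event: CAD and DM shorten event-free time
t_event = rng.exponential(60, n) / (1 + 2.0 * df["obstructive_cad"] + 0.5 * df["dm"])
followup = 24.0                                        # administrative censoring at 24 months
df["time"] = np.minimum(t_event, followup)
df["event"] = (t_event <= followup).astype(int)        # 1 = event observed, 0 = censored
df["cad_x_dm"] = df["obstructive_cad"] * df["dm"]      # multiplicative interaction term

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # hazard ratios for CAD, DM, and their interaction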
Altitude Acclimatization Alleviates the Hypoxia-Induced Suppression of Exogenous Glucose Oxidation During Steady-State Aerobic Exercise
This study investigated how high-altitude (HA, 4300 m) acclimatization affected exogenous glucose oxidation during aerobic exercise. Sea-level (SL) residents (n = 14 men) performed 80-min, metabolically matched exercise (VO2 ≈ 1.7 L/min) at SL and at HA < 5 h after arrival (acute HA, AHA) and following 22 d of HA acclimatization (chronic HA, CHA). During HA acclimatization, participants sustained a controlled negative energy balance (-40%) to simulate the "real world" conditions that lowlanders typically experience during HA sojourns. During exercise, participants consumed carbohydrate (CHO, n = 8, 65.25 g fructose + 79.75 g glucose, 1.8 g carbohydrate/min) or placebo (PLA, n = 6). Total carbohydrate oxidation was determined by indirect calorimetry and exogenous glucose oxidation by tracer technique with 13C. Participants lost (P ≤ 0.05, mean ± SD) 7.9 ± 1.9 kg body mass during the HA acclimatization and energy deficit period. In CHO, total exogenous glucose oxidized during the final 40 min of exercise was lower (P < 0.01) at AHA (7.4 ± 3.7 g) than at SL (15.3 ± 2.2 g) and CHA (12.4 ± 2.3 g), but there were no differences between SL and CHA. Blood glucose and insulin increased (P ≤ 0.05) during the first 20 min of exercise in CHO, but not PLA. In CHO, glucose declined to pre-exercise concentrations as exercise continued at SL, but remained elevated (P ≤ 0.05) throughout exercise at AHA and CHA. Insulin increased during exercise in CHO, but the increase was greater (P ≤ 0.05) at AHA than at SL and CHA, which did not differ. Thus, while acute hypoxia suppressed exogenous glucose oxidation during steady-state aerobic exercise, that hypoxic suppression is alleviated following altitude acclimatization and concomitant negative energy balance.
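Exogenous glucose oxidation in tracer studies of this kind is typically derived from the 13C enrichment of expired CO2 relative to that of the ingested drink, scaled by CO2 production from indirect calorimetry. Below is a minimal sketch of that standard calculation with invented enrichment values and a stoichiometric constant; the exact constants and corrections used by the authors may differ.

# Hypothetical illustration: exogenous glucose oxidation from 13C-CO2 enrichment.
# General form: exo (g/min) = VCO2 * (Rexp - Rbkg) / (Rexo - Rbkg) / k,
# where k ~ 0.7426 L of CO2 produced per gram of glucose oxidized.
K_CO2_PER_G_GLUCOSE = 0.7426

def exogenous_glucose_oxidation(vco2_l_min: float,
                                r_expired: float,
                                r_background: float,
                                r_exogenous: float) -> float:
    """Exogenous glucose oxidation rate in g/min (illustrative values only)."""
    enrichment_fraction = (r_expired - r_background) / (r_exogenous - r_background)
    return vco2_l_min * enrichment_fraction / K_CO2_PER_G_GLUCOSE

# Invented 13C/12C ratios; real studies often express these as delta-13C values.
rate = exogenous_glucose_oxidation(vco2_l_min=1.6,
                                   r_expired=0.01125,
                                   r_background=0.01118,
                                   r_exogenous=0.01180)
print(f"exogenous glucose oxidation ~ {rate:.2f} g/min")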
Qualitative Research on Work-Family in the Management Field: A Review
Despite a proliferation of work-family literature over the past three decades, studies employing quantitative methodologies significantly outnumber those adopting qualitative approaches. In this paper, we explore the state of qualitative work-family research in the management field and provide a comprehensive profile of the 152 studies included in this review. We synthesize the findings of qualitative work-family studies around six themes: parenthood, gender differences, cultural differences, family-friendly policies and non-traditional work arrangements, coping strategies, and under-studied populations. We also describe how the findings of qualitative work-family studies compare to those of quantitative studies. The review highlights seven conclusions about the current qualitative literature: a limited number of qualitative endeavours, findings worth further attention, convergent foci, the loose use of work-family terminology, the neglect of a variety of qualitative research approaches, quantitative attitudes towards qualitative research, and insufficient reporting of research methods. In addition, implications for future researchers are discussed.
- …