The Track Record on Takings Legislation: Lessons from Democracy's Laboratories
This report by the Georgetown Environmental Law & Policy Institute, entitled The Track Record on Takings Legislation: Lessons from Democracy's Laboratories, examines the experiences of Florida, Oregon, and several other states with legislation implementing the property rights agenda. The report is the first comprehensive effort to systematically identify and evaluate the on-the-ground consequences of so-called takings compensation laws. The report's major findings are that the takings agenda has undermined community protections by forcing a rollback of existing legal rules and/or by exerting a chilling effect on new legislative activity; that special interests such as developers and timber companies have been the primary beneficiaries of takings legislation; that the takings laws have fomented and exacerbated neighbor-to-neighbor conflicts over land use; that the takings agenda has conferred large windfalls on certain owners, either in the form of taxpayer-funded awards or special exemptions from the rules that apply to the rest of the community; and that the property rights agenda has undermined the democratic process. Contrary to a common argument made by proponents of this type of legislation, requiring the government to pay to regulate does not lead government officials to make a more nuanced appraisal of the costs and benefits of regulations, apparently because the salience of fiscal costs to government officials far outweighs the relatively more diffuse political benefits of community and homeowner protection.
Intertheoretic Explanation in International Relations: Some Milestones Toward a Synthesis of American Structural Realism and Contemporary French Geopolitics
Structural realism is currently at the center of international political theory in the United States. Sociological interpretations of ethnocultural or linguistic hegemony aside, this scientifically rigorous theorizing can stand on its intrinsic merits and is destined to exercise a major influence on future efforts to construct explanatory models of international political relations. This article sets out why that is so by drawing a profile of a viable deductive macrotheory of interstate politics. The new realist theory is distinguished from its more overtly normative and prescriptivist antecedents, which sought to come to terms with the contending claims of power and ethics in world politics, and from the self-conscious scientism of earlier systems thinking, which emphasized unit-process interaction patterns. Structural realism has broken free from the holistic organicism of systems theory, tributary to biological models, to align the theory-building enterprise with the more successful formal structuralism of the physico-chemical sciences, which places a premium on the generic description of logico-mathematical group structures arrived at through the inventive deduction or axiomatic definition of their constituent units. The exemplar text of American structural realism posits a form of what Piaget called 'relational' structuralism predicated on distributions of power resources among the international system's units. This focus on internal or necessary asymmetric relations between and among polarizing and dependent units renders structural realism a choice object for synthesis with inductively generated geopolitical constructs which stress microstructural configurations of relative capabilities.
The current wave of geopolitical writing in the French language is drawn on to demonstrate how the procedures of intertheoretical reduction can be employed to enrich structural realism's explanation of system-level constraints on state action via the introduction of a spatiotemporal component.
Brown, W. Norman, The United States and India, Pakistan, Bangladesh. Harvard University Press, (The American Foreign Policy Library), Cambridge, Mass., 1972, 462 p.
TRADE REMEDY ACTIONS IN NAFTA AGRICULTURE AND AGRI-FOOD INDUSTRIES
International Relations/Trade
Identifying features predictive of faculty integrating computation into physics courses
Computation is a central aspect of 21st century physics practice; it is used
to model complicated systems, to simulate impossible experiments, and to
analyze mountains of data. Physics departments and their faculty are
increasingly recognizing the importance of teaching computation to their
students. We recently completed a national survey of faculty in physics
departments to understand the state of computational instruction and the
factors that underlie that instruction. The data collected from the faculty
responding to the survey included a variety of scales, binary questions, and
numerical responses. We then used Random Forest, a supervised learning
technique, to explore the factors that are most predictive of whether a faculty
member decides to include computation in their physics courses. We find
experience using computation with students in their research (or lack
thereof), along with various personal beliefs, to be most predictive of a
faculty member having experience teaching computation. Interestingly, we find demographic and
departmental factors to be less useful factors in our model. The results of
this study inform future efforts to promote greater integration of computation
into the physics curriculum as well as comment on the current state of
computational instruction across the United States
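The modeling step described above can be sketched in a few lines. The snippet below is a minimal, illustrative example (not the authors' actual analysis): it trains a Random Forest on synthetic "survey" features and ranks them by importance. The feature names, data, and effect sizes are assumptions made up for the sketch, chosen only to mirror the paper's qualitative finding that research use and beliefs dominate.

```python
# Minimal sketch: Random Forest feature importance on synthetic survey data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 400

# Hypothetical survey responses (binary, Likert scale, numerical).
uses_computation_in_research = rng.integers(0, 2, n)   # binary yes/no
belief_computation_matters = rng.integers(1, 6, n)     # 1-5 Likert scale
department_size = rng.integers(5, 60, n)               # numerical

# Synthetic outcome driven mostly by research use and beliefs.
logits = (2.0 * uses_computation_in_research
          + 0.8 * (belief_computation_matters - 3))
teaches_computation = (logits + rng.normal(0, 1, n) > 0).astype(int)

X = np.column_stack([uses_computation_in_research,
                     belief_computation_matters,
                     department_size])
names = ["research_use", "beliefs", "dept_size"]

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, teaches_computation)

# Rank features by the forest's impurity-based importance.
for name, imp in sorted(zip(names, model.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.2f}")
```

On real survey data the same `feature_importances_` ranking is what identifies which factors are most predictive of teaching computation.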
Quantifying methane and nitrous oxide emissions from the UK and Ireland using a national-scale monitoring network
The UK is one of several countries around the world that have enacted legislation to reduce their greenhouse gas emissions. In this study, we present top-down emissions of methane (CH4) and nitrous oxide (N2O) for the UK and Ireland over the period August 2012 to August 2014. These emissions were inferred using measurements from a network of four sites around the two countries. We used a hierarchical Bayesian inverse framework to infer fluxes as well as a set of covariance parameters that describe uncertainties in the system. We inferred average UK total emissions of 2.09 (1.65–2.67) Tg yr−1 CH4 and 0.101 (0.068–0.150) Tg yr−1 N2O and found our derived UK estimates to be generally lower than the a priori emissions, which consisted primarily of anthropogenic sources with a smaller contribution from natural sources. We used sectoral distributions from the UK National Atmospheric Emissions Inventory (NAEI) to determine whether these discrepancies could be attributed to specific source sectors. Because of the distinct distributions of the two dominant CH4 emissions sectors in the UK, agriculture and waste, we found that the inventory may overestimate agricultural CH4 emissions. We found that annual mean N2O emissions were consistent with both the prior and the anthropogenic inventory, but we derived a significant seasonal cycle in emissions. This seasonality is likely due to seasonality in fertilizer application and in environmental drivers such as temperature and rainfall, which are not reflected in the annual-resolution inventory. Through the hierarchical Bayesian inverse framework, we quantified uncertainty covariance parameters and emphasized their importance for high-resolution emissions estimation. We inferred average model errors of approximately 20 and 0.4 ppb and correlation timescales of 1.0 (0.72–1.43) and 2.6 (1.9–3.9) days for CH4 and N2O, respectively.
These errors are a combination of transport model errors as well as errors due to unresolved emissions processes in the inventory. We found the largest CH4 errors at the Tacolneston station in eastern England, which may be due to sporadic emissions from landfills and offshore gas in the North Sea
Hidden Structure: Using Network Methods to Map Product Architecture
In this paper, we describe an operational methodology for characterizing the architecture of complex technical systems and demonstrate its application to a large sample of software releases. Our methodology is based upon directed network graphs, which allow us to identify all of the direct and indirect linkages between the components in a system. We use this approach to define three fundamental architectural patterns, which we label core-periphery, multi-core, and hierarchical. Applying our methodology to a sample of 1,286 software releases from 17 applications, we find that the majority of releases possess a "core-periphery" structure. This architecture is characterized by a single dominant cyclic group of components (the "Core") that is large relative to the system as a whole as well as to other cyclic groups in the system. We show that the size of the Core varies widely, even for systems that perform the same function. These differences appear to be associated with different models of development: open, distributed organizations develop systems with smaller Cores, while closed, co-located organizations develop systems with larger Cores. Our findings establish some "stylized facts" about the fine-grained structure of large, real-world technical systems, serving as a point of departure for future empirical work.
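The core-identification step described above can be sketched with a standard graph library: represent component dependencies as a directed graph and treat the largest cyclic group (strongly connected component) as the "Core". The toy dependency list below is invented for illustration, not taken from the paper's sample.

```python
# Minimal sketch: find the "Core" (largest cyclic group) of a
# component-dependency graph using strongly connected components.
import networkx as nx

# Hypothetical dependency edges: (X, Y) means component X depends on Y.
edges = [("A", "B"), ("B", "C"), ("C", "A"),  # cyclic group: the Core
         ("D", "A"),                          # periphery -> Core
         ("C", "E")]                          # Core -> periphery

G = nx.DiGraph(edges)

# Strongly connected components, largest first.
sccs = sorted(nx.strongly_connected_components(G), key=len, reverse=True)
core = sccs[0]

print("Core:", sorted(core))
print("Core share of system: %.0f%%"
      % (100 * len(core) / G.number_of_nodes()))
```

A release would be classed as "core-periphery" when, as here, a single cyclic group dominates both the system and all other cyclic groups; comparing `len(core)` across releases gives the Core-size variation the paper measures.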