FARM-LEVEL EVALUATION OF ALTERNATIVE POLICY APPROACHES TO REDUCE NITRATE LEACHING FROM MIDWEST AGRICULTURE
Policies to reduce nitrate leaching are evaluated using a mixed integer linear programming model of a representative Michigan cash grain farm. At spring 1993 prices, elimination of the current deficiency payment program is found to be more efficient at reducing leaching than a nitrogen input tax, a tax credit on biologically fixed nitrogen, a rotation payment, or obligatory use of the Integrated Farm Management Program Option (IFMPO). However, elimination of the deficiency payment program would significantly reduce farm income. Modeling risk management and nitrate leaching dynamics would be useful extensions of this research, as would estimating the benefits of averting nitrate leaching.
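The policy comparison above rests on a profit-maximizing farm model whose crop choices respond to payment rules. A toy sketch of that logic (not the paper's MILP, and with all figures invented): enumerate rotation choices for a stylized three-field farm and compare the profit-maximizing plan's nitrate leaching with and without a hypothetical per-acre deficiency payment on corn.

```python
# Toy illustration (not the paper's MILP): brute-force enumeration of crop
# assignments for a stylized farm. All dollar and leaching figures are invented.

from itertools import product

# crop -> (net return $/acre, nitrate leaching lb/acre) -- hypothetical values.
CROPS = {"corn": (80, 40), "soybean": (90, 15), "wheat": (70, 10)}
FIELDS = 3  # identical fields, one crop each

def best_plan(deficiency_payment):
    """Pick the profit-maximizing assignment; return (profit, total leaching)."""
    best = None
    for plan in product(CROPS, repeat=FIELDS):
        profit = sum(CROPS[c][0] + (deficiency_payment if c == "corn" else 0)
                     for c in plan)
        leach = sum(CROPS[c][1] for c in plan)
        if best is None or profit > best[0]:
            best = (profit, leach)
    return best

with_payment = best_plan(deficiency_payment=30)   # corn subsidized -> all corn
without_payment = best_plan(deficiency_payment=0)  # soybean wins -> less leaching
```

In this invented example, removing the payment lowers both leaching and farm income, mirroring the trade-off the abstract reports.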
Evaluating the Long-run Impacts of the 9/11 Terrorist Attacks on US Domestic Airline Travel
Although the US airline industry began 2001 with 24 consecutive profitable quarters, including net profits in 2000 totaling $7.9 billion, the impact of the 9/11 event on the industry was substantial. Whereas the recession that began in early 2001 signaled the end of profitability, the 9/11 terrorist attacks pushed the industry into financial crisis after air travel dropped 20% over the September–December 2001 period compared to the same period in 2000. Given the decline in domestic air travel, an important question is whether the detrimental impact of the attacks was temporary or permanent. That is, did airline travel return to the trend that existed prior to the terrorist attacks? There are theoretical reasons to believe that it would not. Economists have long viewed travel-mode choices as the outcome of a comparison of opportunity costs and benefits. Thus, anything that permanently raises the opportunity cost of travel, holding benefits constant, should reduce the level of travel volume. To determine whether air travel was permanently reduced, we use econometric and time-series forecasting models to generate a counterfactual forecast of air travel volume in the absence of the terrorist attacks. These dynamic forecasts are compared to actual air travel levels to determine the impact of the terrorist attacks. The findings suggest that domestic air travel did not return to the levels that would have existed in the absence of the attacks.
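The counterfactual-forecast method described above can be sketched in miniature: fit a trend to pre-shock data only, extrapolate it past the shock, and read the impact as the gap between forecast and observed values. This is a deliberately simplified sketch (a linear trend, not the authors' econometric or time-series models), and all numbers are made up for illustration.

```python
# Toy counterfactual-forecast sketch (not the authors' model).
# Fit a linear pre-shock trend, extrapolate it, and compare to observed values.

def linear_fit(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical monthly travel volumes (index units), months 0-7 before a shock.
pre = [100, 102, 104, 107, 108, 111, 113, 115]
# Observed volumes for months 8-11, after the hypothetical shock.
post_observed = [92, 95, 97, 99]

a, b = linear_fit(list(range(len(pre))), pre)
counterfactual = [a + b * t
                  for t in range(len(pre), len(pre) + len(post_observed))]

# Shortfall relative to the no-shock trend: positive gaps = permanent reduction.
gaps = [cf - obs for cf, obs in zip(counterfactual, post_observed)]
```

A persistent positive gap, as in this invented series, is the pattern the paper interprets as travel failing to return to its pre-attack trend.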
Fixing the functoriality of Khovanov homology
We describe a modification of Khovanov homology (math.QA/9908171), in the spirit of Bar-Natan (math.GT/0410495), which makes the theory properly functorial with respect to link cobordisms. This requires introducing `disorientations' in the category of smoothings and abstract cobordisms between them used in Bar-Natan's definition. Disorientations have `seams' separating oppositely oriented regions, coming with a preferred normal direction. The seams satisfy certain relations (just as the underlying cobordisms satisfy relations such as the neck cutting relation). We construct explicit chain maps for the various Reidemeister moves, then prove that the compositions of chain maps associated to each side of each of Carter and Saito's movie moves (MR1238875, MR1445361) always agree. These calculations are greatly simplified by following arguments due to Bar-Natan and Khovanov, which ensure that the two compositions must agree up to a sign. We set up this argument in our context by proving a result about duality in Khovanov homology, generalising previous results about mirror images of knots to a `local' result about tangles. Along the way, we reproduce Jacobsson's sign table (math.GT/0206303) for the original `unoriented theory', with a few
disagreements.
Supporting Real-Time Applications in an Integrated Services Packet Network: Architecture and Mechanism
This paper considers the support of real-time applications in an Integrated Services Packet Network (ISPN). We first review the characteristics of real-time applications. We observe that, contrary to the popular view that real-time applications necessarily require a fixed delay bound, some real-time applications are more flexible and can adapt to current network conditions. We then propose an ISPN architecture that supports two distinct kinds of real-time service: guaranteed service, which is the traditional form of real-time service discussed in most of the literature and involves pre-computed worst-case delay bounds, and predicted service, which uses the measured performance of the network in computing delay bounds. We then propose a packet scheduling mechanism that can support both of these real-time services as well as accommodate datagram traffic. We also discuss two other aspects of an overall ISPN architecture: the service interface and the admission control criteria.
Research at MIT was supported by DARPA through NASA Grant NAG 2-582, by NSF grant NCR-8814187, and by DARPA and NSF through Cooperative Agreement NCR-8919038 with the Corporation for National Research Initiatives.
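The scheduling idea above, serving guaranteed, predicted, and datagram traffic from one mechanism, can be caricatured with a toy class-based scheduler. This is a deliberately crude sketch, not the paper's mechanism (which combines weighted fair queueing for guaranteed flows with a FIFO+ scheme for predicted service); it only illustrates separating traffic into service classes served in priority order.

```python
# Toy sketch of class-based packet scheduling (NOT the ISPN paper's mechanism):
# strict priority across three service classes, FIFO within each class.

from collections import deque

class ClassScheduler:
    # Higher-priority classes come first.
    CLASSES = ("guaranteed", "predicted", "datagram")

    def __init__(self):
        self.queues = {c: deque() for c in self.CLASSES}

    def enqueue(self, packet, service_class):
        """Append a packet to its class queue (FIFO within the class)."""
        self.queues[service_class].append(packet)

    def dequeue(self):
        """Return the next packet, preferring higher service classes."""
        for c in self.CLASSES:
            if self.queues[c]:
                return self.queues[c].popleft()
        return None  # all queues empty

sched = ClassScheduler()
sched.enqueue("d1", "datagram")
sched.enqueue("g1", "guaranteed")
sched.enqueue("p1", "predicted")
order = [sched.dequeue() for _ in range(3)]  # guaranteed, then predicted, then datagram
```

Strict priority like this can starve datagram traffic, which is one reason the actual proposal uses admission control and more nuanced queueing rather than priority alone.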
Our interests and Christ: the Christian existentialism of Helmut Rex
This thesis assesses both the historical and ongoing contribution to theology in New Zealand of Helmut Rex. Born in 1913 and educated in Berlin, Rex joined the Confessing Church as a trainee pastor and soon came into conflict with the Nazis. After arriving in New Zealand as a refugee he taught courses in biblical languages, church history, literature and hermeneutics. Rex initially taught at Knox Theological College and later at the nearby University of Otago. In 1948 Rex secured an M.A. in philosophy at the University of Otago and in 1954 completed a doctorate in Pauline eschatology and ethics at Tubingen. His competence as a scholar and teacher was recognised with the establishment in 1953 of a professorial chair in church history for his benefit. Although plagued by ill-health, Rex lectured until he took early retirement in 1963.
This study focuses predominantly on Rex's public lectures and publications and consequently leaves the majority of Rex's work in the field of biblical studies untouched. Five key characteristics, or leitmotifs, are uncovered in Rex's work. First, Rex places an emphasis on the need for ongoing reformation of the Christian faith. Second, he is concerned to be clear about just what we are actually able to assert about 'God and man'. Rex does this in order to move from a sure-footing towards those matters about which we would like to know. These are the subject of the third leitmotif in Rex's thought - 'matters of ultimate concern'. Fourth, Rex draws from a wide range of academic disciplines in order to construct a theology for 'modern man'. This approach has been called 'bricolage'. Finally, Rex's incisive analysis and quirky narrative style are seen to generate empathy in an audience in a manner that jars with his unexpected conclusions. This fifth characteristic, labelled 'cognitive dissonance', effects in an audience a re-appraisal of previously held convictions.
Towards the end of Rex's teaching career he asserted that the theologian should have dual foci - 'Christ and our interests'. It is observed in this thesis that the Christian scriptures inform Rex's understanding of the 'Christ', and existentialist philosophy his understanding of 'our interests'. It is argued that the approach might better be labelled 'our interests and Christ' because existentialist anthropology is both its starting and its strong point.
Investigation into Rex's papers reveals the benefit 'an sich' of study into the history of ideas. As Rex suggests, we study history primarily in order to understand ourselves. Rex's theology in particular is a record of an attempt to come to terms with the Christian faith in a secular environment. Rex was unusually sensitive to the ebb and flow of historical tides. Despite living in a period of unparalleled Church growth Rex maintained that the church was becoming less and less relevant. He was convinced that the Church needed to reform itself drastically. Rex was also responsive to new philosophical and theological movements. He was influential in the spread of existentialist ideas and gave early and considered assessment of the 'death of God' theologians.
Although his concrete contributions were largely 'of their time', Rex's grapplings with new philosophical and theological trends in a time of increasing religious uncertainty offer direction for those wishing to express theological truth in today's environment. Towards the end of his teaching career Rex wrote papers addressing particular societal issues. His papers on race relations, homosexuality and alcoholism are illustrative of the way in which a sound theological framework coupled with a keen and sympathetic ear can shed new light on existing issues. Rex's most important impact was on his students and those who assembled to hear his public addresses. His embodiment of dedication to learning and earnest pursuit of the truth generated an impression few who encountered him would forget. Using a term coined by one of Rex's former pupils, this impact, as it is explored in this thesis, has been labelled 'the scandal of particularity'.
From Sensors to Knowledge: The Challenge of Training the Next Generation of Data Analysts
With the advent of commercial-off-the-shelf sensors for use in a variety of applications, integration with analytical software tools, and expansion of available archived datasets, there is a critical need to address the problem of transforming resultant data into comprehensible, actionable information for decision-makers through rigorous analysis. In previous research the participating authors have emphasized that users are often faced with the situation in which they are "drowning in a sea of data" but still "thirsting for knowledge". The availability of analysis software, tools, and techniques provides opportunities for information collection of ever increasing complexity, but the need for the training of analysts to employ appropriate tools and processes to ensure accurate and applicable results has not been addressed. The purpose of this paper is to discuss the challenges and opportunities facing the training of effective analysts capable of handling a wide range of data types in this era of dynamic tools and techniques.
Negation’s Not Solved: Generalizability Versus Optimizability in Clinical Natural Language Processing
A review of published work in clinical natural language processing (NLP) may suggest that the negation detection task has been "solved." This work proposes that an optimizable solution does not equal a generalizable solution. We introduce a new machine learning-based Polarity Module for detecting negation in clinical text, and extensively compare its performance across domains. Using four manually annotated corpora of clinical text, we show that negation detection performance suffers when there is no in-domain development (for manual methods) or training data (for machine learning-based methods). Various factors (e.g., annotation guidelines, named entity characteristics, the amount of data, and lexical and syntactic context) play a role in making generalizability difficult, but none completely explains the phenomenon. Furthermore, generalizability remains challenging because it is unclear whether to use a single source for accurate data, combine all sources into a single model, or apply domain adaptation methods. The most reliable means to improve negation detection is to manually annotate in-domain training data (or, perhaps, manually modify rules); this is a strategy for optimizing performance, rather than generalizing it. These results suggest a direction for future work in domain-adaptive and task-adaptive methods for clinical NLP.
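The optimizable-versus-generalizable distinction above can be made concrete with a toy rule-based negation detector (in the spirit of trigger-phrase systems such as NegEx, not the paper's Polarity Module). A lexicon tuned on one "domain" scores perfectly there yet misses negations phrased with unseen triggers elsewhere; all sentences, triggers, and labels below are invented for illustration.

```python
# Toy trigger-based negation detection (not the paper's Polarity Module).
# A concept is flagged negated if a trigger token appears within a small
# window of words immediately before it.

NEG_TRIGGERS = ("no", "denies", "without")  # tuned on the in-domain corpus

def is_negated(sentence, concept, window=4):
    words = sentence.lower().split()
    if concept not in words:
        return False
    idx = words.index(concept)
    scope = words[max(0, idx - window):idx]  # tokens just before the concept
    return any(t in scope for t in NEG_TRIGGERS)

def accuracy(corpus):
    """Fraction of (sentence, concept, gold label) triples classified correctly."""
    return sum(is_negated(s, c) == gold for s, c, gold in corpus) / len(corpus)

# Hypothetical mini-corpora.
in_domain = [
    ("patient denies chest pain", "pain", True),
    ("no evidence of pneumonia", "pneumonia", True),
    ("history of diabetes", "diabetes", False),
]
out_of_domain = [
    ("scan negative for lesions", "lesions", True),  # unseen trigger -> missed
    ("rash resolved without treatment", "rash", False),
]
```

Here in-domain accuracy exceeds out-of-domain accuracy purely because the trigger lexicon was fit to one corpus's phrasing, a miniature of the optimized-but-not-generalized behavior the abstract describes.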