Bruno Zevi, the continental European emissary of Geoffrey Scott’s theories
The inheritance of dynamic and deontic integrity constraints or: Does the boss have more rights?
In [18,23], we presented a language for the specification of static, dynamic and deontic integrity constraints (IC's) for conceptual models (CM's). An important problem not discussed in that paper is how IC's are inherited in a taxonomic network of types. For example, if students are permitted to perform certain actions under certain preconditions, must we repeat these preconditions when specializing this action for the subtype of graduate students, or are they inherited, and if so, how? For static constraints, this problem is relatively trivial, but for dynamic and deontic constraints, it will turn out to contain numerous pitfalls, caused by the fact that common sense supplies presuppositions about the structure of IC inheritance that are not warranted by logic. In this paper, we unravel some of these presuppositions and show how to avoid the pitfalls. We first formulate a number of general theorems about the inheritance of necessary and/or sufficient conditions and show that for upward inheritance, a closure assumption is needed. We apply this to dynamic and deontic IC's, where conditions are preconditions of actions, and show that our common sense is sometimes mistaken about the logical implications of what we have specified. We also show the connection of necessary and sufficient preconditions of actions with the specification of weakest preconditions in programming logic. Finally, we argue that information analysts usually assume constraint completion in the specification of (pre)conditions analogous to predicate completion in Prolog and circumscription in non-monotonic logic. The results are illustrated with numerous examples and compared with other approaches in the literature.
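As a minimal illustration of the "relatively trivial" static case (the predicates below are an invented example, not taken from the paper): if graduate students are a subtype of students, a static constraint on students inherits downward by simple transitivity of implication:

```latex
\forall x\,\bigl(\mathit{Grad}(x) \rightarrow \mathit{Student}(x)\bigr),\qquad
\forall x\,\bigl(\mathit{Student}(x) \rightarrow \mathit{age}(x) \ge 16\bigr)
\;\;\vdash\;\;
\forall x\,\bigl(\mathit{Grad}(x) \rightarrow \mathit{age}(x) \ge 16\bigr)
```

For dynamic and deontic constraints, where the condition is a precondition of an action rather than a predicate on a state, the abstract's point is that the analogous argument does not go through unaided: upward inheritance in particular requires an additional closure assumption.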
A Critical Review of "Automatic Patch Generation Learned from Human-Written Patches": Essay on the Problem Statement and the Evaluation of Automatic Software Repair
At ICSE'2013, there was the first session ever dedicated to automatic program
repair. In this session, Kim et al. presented PAR, a novel template-based
approach for fixing Java bugs. We strongly disagree with key points of this
paper. Our critical review has two goals. First, we aim at explaining why we
disagree with Kim and colleagues and why the reasons behind this disagreement
are important for research on automatic software repair in general. Second, we
aim at contributing to the field with a clarification of the essential ideas
behind automatic software repair. In particular we discuss the main evaluation
criteria of automatic software repair: understandability, correctness and
completeness. We show that depending on how one sets up the repair scenario,
the evaluation goals may be contradictory. Eventually, we discuss the nature of
fix acceptability and its relation to the notion of software correctness.
Comment: ICSE 2014, India (2014)
Your JSON is not my JSON: a case for more fine-grained content negotiation
Information resources can be expressed in different representations along many dimensions such as format, language, and time. Through content negotiation, HTTP clients and servers can agree on which representation is most appropriate for a given piece of data. For instance, interactive clients typically indicate they prefer HTML, whereas automated clients would ask for JSON or RDF. However, labels such as “JSON” and “RDF” are insufficient to negotiate between the rich variety of possibilities offered by today’s languages and data models. This position paper argues that, despite widespread misuse, content negotiation remains the way forward. However, we need to extend it with more granular options in order to serve different current and future Web clients sustainably.
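A minimal sketch of the negotiation step the abstract describes, assuming a deliberately simplified Accept-header parser. The `profile` parameter and the example URLs are illustrative inventions (not taken from the paper); they show why a bare label like `application/json` underdetermines which representation a server should send.

```python
def parse_accept(header):
    """Parse an Accept header into (media_type, params, q) tuples, best q first."""
    entries = []
    for part in header.split(","):
        pieces = [p.strip() for p in part.split(";")]
        media_type, params, q = pieces[0], {}, 1.0
        for piece in pieces[1:]:
            key, _, value = piece.partition("=")
            if key.strip() == "q":
                q = float(value)
            else:
                params[key.strip()] = value.strip().strip('"')
        entries.append((media_type, params, q))
    return sorted(entries, key=lambda entry: entry[2], reverse=True)

def choose_representation(accept_header, available):
    """Return the first available representation matching the client's preferences.

    Each representation is a dict with a "type" key and optional extra keys
    (here a hypothetical "profile") that must match the Accept parameters.
    """
    for media_type, params, _ in parse_accept(accept_header):
        for rep in available:
            if rep["type"] == media_type and all(
                rep.get(key) == value for key, value in params.items()
            ):
                return rep
    return None
```

With two JSON-family representations on offer, a client asking for `application/ld+json;profile="http://example.org/shape"` gets the profiled JSON-LD, while a client whose profile the server cannot satisfy falls back to plain JSON; this is the finer granularity the paper argues for.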
High-Fidelity Spectroscopy at the Highest Resolutions
High-fidelity spectroscopy presents challenges for both observations and in
designing instruments. High-resolution and high-accuracy spectra are required
for verifying hydrodynamic stellar atmospheres and for resolving intergalactic
absorption-line structures in quasars. Even with great photon fluxes from large
telescopes with matching spectrometers, precise measurements of line profiles
and wavelength positions encounter various physical, observational, and
instrumental limits. The analysis may be limited by astrophysical and telluric
blends, lack of suitable lines, imprecise laboratory wavelengths, or
instrumental imperfections. To some extent, such limits can be pushed by
forming averages over many similar spectral lines, thus averaging away small
random blends and wavelength errors. In situations where theoretical
predictions of lineshapes and shifts can be accurately made (e.g., hydrodynamic
models of solar-type stars), the consistency between noisy observations and
theoretical predictions may be verified; however this is not feasible for,
e.g., the complex of intergalactic metal lines in spectra of distant quasars,
where the primary data must come from observations. To more fully resolve
lineshapes and interpret wavelength shifts in stars and quasars alike, spectral
resolutions on the order of R=300,000 or more are required, a level that is becoming
(but is not yet) available. A grand challenge remains to design efficient
spectrometers with resolutions approaching R=1,000,000 for the forthcoming
generation of extremely large telescopes.
Comment: 6 pages, 4 figures, to appear in Reviews in Modern Astronomy vol. 22 (2010)
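For scale, the standard definition of spectral resolving power (the usual convention, not spelled out in the abstract) connects these resolutions to velocity resolution:

```latex
R = \frac{\lambda}{\Delta\lambda}, \qquad \Delta v = \frac{c}{R}
```

At R = 300,000 this gives Δv ≈ 299,792 km/s / 300,000 ≈ 1 km/s, and the R = 1,000,000 goal corresponds to Δv ≈ 0.3 km/s, which is the regime needed to resolve the lineshapes discussed above.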
Fight or Flight? Defending Against Sequential Attacks in the Game of Siege
This paper examines theory and behavior in a two-player game of siege, sequential attack and defense. The attacker’s objective is to win at least one battle, while the defender’s objective is to win every battle. Theoretically, the defender either folds immediately or, if his valuation is sufficiently high and the number of battles is sufficiently small, has a constant incentive to fight in each battle. Attackers respond to defense with diminishing assaults over time. Consistent with theoretical predictions, our experimental results indicate that the probability of successful defense increases in the defender’s valuation and decreases in the overall number of battles in the contest. However, the defender engages in the contest significantly more often than predicted, and the aggregate expenditures by both parties exceed predicted levels. Moreover, both defenders and attackers actually increase the intensity of the fight as they approach the end of the contest.
Keywords: Colonel Blotto, conflict resolution, weakest-link, game of siege, multi-period resource allocation, experiments.
Avoiding the Common Wisdom Fallacy: The Role of Social Sciences in Constitutional Adjudication
More than one hundred years ago, the U.S. Supreme Court started to refer to social science evidence in its judgments. However, this has not resonated with many constitutional courts outside the United States, in particular in continental Europe. This contribution has a twofold aim. First, it tries to show that legal reasoning in constitutional law is often based on empirical assumptions, so that there is a strong need for the use of social sciences. However, constitutional courts often lack the necessary expertise to deal with empirical questions. Therefore, I will discuss three potential strategies to make use of social science evidence. Judges can interpret social facts on their own, they can afford a margin of appreciation to the legislator, or they can defer the question to social science experts. It will be argued that none of these strategies is satisfactory, so that courts will have to employ a combination of different strategies. In order to illustrate the argument, I will discuss decisions of different jurisdictions, including the United States, Canada, Germany and South Africa.
Keywords: proportionality, comparative law, Germany, uncertainty, margin of appreciation, constitutional law, Canada, South Africa, social sciences, empiricism
Challenges and Opportunities to Updating Prescribing Information for Longstanding Oncology Drugs.
A number of important drugs used to treat cancer, many of which serve as the backbone of modern chemotherapy regimens, have outdated prescribing information in their drug labeling. The Food and Drug Administration is undertaking a pilot project to develop a process and criteria for updating prescribing information for longstanding oncology drugs, based on the breadth of knowledge the cancer community has accumulated with the use of these drugs over time. This article highlights a number of considerations for labeling updates, including selecting priorities for updating; data sources and evidentiary criteria; and the risks, challenges, and opportunities for iterative review to ensure that prescribing information for oncology drugs remains relevant to current clinical practice.