Improvements for Free
"Theorems for Free!" (Wadler, FPCA 1989) is a slogan for a technique that
allows one to derive statements about functions just from their types. So far,
the statements considered have always had a purely extensional flavor:
statements relating the value semantics of program expressions, but not
statements relating their runtime (or other) cost. Here we study an extension
of the technique that allows precisely statements of the latter flavor, by
deriving quantitative theorems for free. After developing the theory, we walk
through a number of example derivations. Probably none of the statements
derived in those simple examples will be particularly surprising to most
readers, but what is perhaps surprising, and at the very least novel, is that
there is a general technique for obtaining such results on a quantitative
level in a principled way. Moreover, there is good potential to bring that
technique to bear on more complex examples as well. We turn our attention to
short-cut fusion (Gill et al., FPCA 1993) in particular.
Comment: In Proceedings QAPL 2011, arXiv:1107.074
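As a hedged illustration of the extensional kind of free theorem the abstract contrasts with: any function of type ∀a. [a] → [a] cannot inspect its elements, so it must commute with map. The sketch below checks this in Python (the function `rotate` is an illustrative stand-in, not from the paper, and Python here stands in for a typed language):

```python
# Free-theorem sketch: an element-agnostic list function f satisfies
#   map g (f xs) == f (map g xs)
# for every g, because f can only rearrange, drop, or duplicate elements.

def rotate(xs):
    # rotate the list by one position; never inspects elements,
    # so it behaves like a parametrically polymorphic function
    return xs[1:] + xs[:1]

def free_theorem_holds(f, g, xs):
    return list(map(g, f(xs))) == f(list(map(g, xs)))

print(free_theorem_holds(rotate, lambda x: x * 2, [1, 2, 3]))  # True
```

The quantitative extension the abstract describes would additionally relate the costs of the two sides, not just their values.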
Deployment Prior Injection for Run-time Calibratable Object Detection
With a strong alignment between the training and test distributions, object
relation as a context prior facilitates object detection. Yet it turns into a
harmful but inevitable training-set bias when test distributions shift
differently across space and time. Nevertheless, existing detectors cannot
incorporate a deployment context prior during the test phase without a
parameter update. This capability requires the model to explicitly learn
representations that are disentangled with respect to the context prior. To
achieve this, we introduce an additional graph input to the detector, where
the graph represents the deployment context prior and its edge values
represent object relations. The detector's behavior is then trained to be
bound to the graph via a modified training objective. As a result, during the
test phase, any suitable deployment context prior can be injected into the
detector via graph edits, calibrating, or "re-biasing", the detector towards
the given prior at run-time without a parameter update. Even if the deployment
prior is unknown, the detector can self-calibrate using a deployment prior
approximated from its own predictions. Comprehensive experimental results on
the COCO dataset, as well as cross-dataset testing on the Objects365 dataset,
demonstrate the effectiveness of the run-time calibratable detector.
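A minimal sketch of the injection idea, where the function names and the exact re-biasing rule are illustrative assumptions rather than the paper's actual method: given per-detection class scores and a class-affinity matrix (standing in for the graph's edge values), the prior can re-weight scores at test time without touching model parameters.

```python
# Hypothetical sketch of run-time prior injection. All names and the
# re-biasing rule below are illustrative assumptions, not the paper's API.
# scores: per-detection class probabilities (rows sum to 1);
# prior:  class-affinity matrix, i.e. the deployment graph's edge values.

def inject_prior(scores, prior):
    n_classes = len(prior)
    # rough scene-level class presence, estimated from the detections
    context = [sum(det[c] for det in scores) / len(scores)
               for c in range(n_classes)]
    # affinity of each class with the current scene under the prior
    bias = [sum(prior[c][k] * context[k] for k in range(n_classes))
            for c in range(n_classes)]
    # re-bias each detection's class distribution, then renormalise
    rebiased = [[det[c] * (1.0 + bias[c]) for c in range(n_classes)]
                for det in scores]
    return [[v / sum(row) for v in row] for row in rebiased]

def estimate_prior(scores):
    # self-calibration: approximate the prior from the model's own predictions
    n_classes = len(scores[0])
    presence = [sum(det[c] for det in scores) / len(scores)
                for c in range(n_classes)]
    return [[presence[i] * presence[j] for j in range(n_classes)]
            for i in range(n_classes)]
```

With an all-zero prior the scores pass through unchanged, which mirrors the property that injecting a neutral graph should leave the detector's behavior intact.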
Greater Manchester green deal communities programme scheme exit paper
The report provides GM Local Authorities with an update on the Greater Manchester Green Deal Communities Programme and with relevant reference details for the post-programme period.
Copyleft vs copyright: some competitive effects of Open Source
In this paper, we study oligopolistic competition between closed and open
source software. Drawing together existing economic contributions on open
source, we propose a two-stage game with perfect information and product
differentiation in which producers first set software quality and then
determine prices (constrained at zero for open source programs). In doing
this, we explicitly model lock-in effects and network externality components
of software quality, as well as knowledge accumulation in software use and
implementation.
With respect to a monopolistic benchmark case, we argue that in a duopoly a
proprietary software producer facing an open source competitor will reduce its
selling price when: (i) its network of users is larger than that of the open
source software and its consumers are largely experienced with its program, or
(ii) it has a small network of unskilled consumers. By contrast, after the
emergence of open source software, the proprietary software price increases if
proprietary software users form a large but poorly skilled network.
Furthermore, we show that, in all the above cases, proprietary software
quality increases because of the existence of an open source alternative to a
previously monopolistic program.
Finally, by modeling knowledge accumulation processes through difference
equations, we show that the ratio between closed and open source programs'
opportunity costs of software learning and deployment plays a crucial role in
shaping market outcomes. As long as an open source software remains too
complex and technical for unskilled or time-scarce users, a shared-market
solution in which both kinds of software are adopted is predicted. In
contrast, if the opportunity costs of learning and understanding open source
programs are remarkably low, or no higher than the opportunity costs of a
closed source software, an open source dominance outcome (i.e., all adopted
software is open source) emerges.
Programs for cheap!
Write down the definition of a recursion operator on a piece of paper. Tell me its type, but be careful not to let me see the operator's definition. I will tell you an optimization theorem that the operator satisfies. As an added bonus, I will also give you a proof of correctness for the optimization, along with a formal guarantee about its effect on performance. The purpose of this paper is to explain these tricks.
Parametric polymorphism and operational improvement
Parametricity, in both operational and denotational forms, has long been a useful tool for reasoning about program correctness. However, there is as yet no comparable technique for reasoning about program improvement, that is, when one program uses fewer resources than another. Existing theories of parametricity cannot be used to address this problem as they are agnostic with regard to resource usage. This article addresses this problem by presenting a new operational theory of parametricity that is sensitive to time costs, which can be used to reason about time improvement properties. We demonstrate the applicability of our theory by showing how it can be used to prove that a number of well-known program fusion techniques are time improvements, including fixed point fusion, map fusion and short cut fusion.
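Short cut fusion, one of the techniques named above, rests on the foldr/build rule, foldr k z (build g) = g k z, which eliminates the intermediate list a producer hands to a consumer. A hedged Python transliteration of the rule (laziness is elided, and the producer below is a deliberately simple illustration):

```python
from functools import reduce

def foldr(k, z, xs):
    # right fold: foldr k z [x1, x2, ...] == k(x1, k(x2, ... z))
    return reduce(lambda acc, x: k(x, acc), reversed(xs), z)

def build(g):
    # a producer abstracted over the list constructors; instantiating
    # them with real cons/nil materialises the list
    return g(lambda x, xs: [x] + xs, [])

def up_to(n):
    # producer in build form: build(up_to(n)) == [1, ..., n]
    def g(cons, nil):
        result = nil
        for i in range(n, 0, -1):
            result = cons(i, result)
        return result
    return g

add = lambda x, acc: x + acc
unfused = foldr(add, 0, build(up_to(4)))  # materialises [1, 2, 3, 4] first
fused = up_to(4)(add, 0)                  # foldr k z (build g) == g k z
print(unfused, fused)  # 10 10
```

Both expressions compute the same value, but the fused form never allocates the intermediate list; the cited theory is what lets one prove such a transformation is a genuine time improvement rather than merely value-preserving.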
"Directing the Upcoming Generation's Mind in the Right Direction": Enslaved Children in the French Emancipation Project in Martinique, 1835-1848
Enslaved children were at the centre of France's reform of colonial slavery from the mid-1830s to 1848. Youths, because of their malleability and supposed innocence, were deemed most deserving of moralizing efforts through religious and elementary education. When emancipation was conceivable by many on both sides of the Atlantic, enslaved children came to represent the future generation of French colonial citizens. Although slaveholders strongly opposed abolitionists' efforts and despite the slow nature of metropolitan reforms, hundreds of enslaved children in Martinique benefitted from the rights they acquired in the last decade and a half of slavery, namely, partial access to religious and elementary education, family rights, and freedom.
The Geologic Strata of the Law School Curriculum
The modest aim of this piece is to supply some historical background to the other contributions to this Symposium. The modern American law school curriculum is the product of a few but critical choices of design, some of them over a century old. In this Article, I seek to (1) outline how the basic structure and content of the modern American law school curriculum came into being and what were the main competitors that curriculum displaced; (2) describe some of the ways in which the curriculum's basic structure and content have changed since its inception; and (3) point to some of the main sources and motors of change.
ORIGINS OF THE BASIC STRUCTURE AND CONTENT OF THE MODERN AMERICAN LAW SCHOOL CURRICULUM
The American law school, in the basic shape in which we recognize it today, originated with the model of legal education that President Charles W. Eliot and Dean Christopher Columbus Langdell established at Harvard in 1870. At first, Harvard's model seemed as if it might fail: the school had lost enrollments and had to make its way against rival models and hostile critics. By 1900, however, Harvard's success was assured. Its example, sometimes transmitted by Harvard's own former faculty serving as pro-consular deans, such as William Keener at Columbia and Joseph Beale at Chicago, spread to other leading law schools between 1895 and 1925; and between 1925 and 1950 virtually every full-time university-based law school in the country had adopted the Harvard model's basic elements. This story of convergence on a uniform model is all the more remarkable when one considers that all of these law schools were preparing their students for a wide variety of roles and careers in the United States' highly stratified and segmented legal profession.
A. The Bedrock: The Harvard Template and Its Rivals
In the period of its adoption, Harvard's model was distinctive in both structure and content.