A Calculus of Bounded Capacities
Resource control has attracted increasing interest in foundational research on distributed systems. This paper focuses on space control and develops an analysis of space usage in the context of an ambient-like calculus with bounded capacities and weighed processes, where migration and activation require space. A type system complements the dynamics of the calculus by providing static guarantees that the intended capacity bounds are preserved throughout the computation.
Enhanced atomic layer etching of native aluminum oxide for ultraviolet optical applications
We report on the development and application of an atomic layer etching (ALE) procedure based on alternating exposures of trimethylaluminum and anhydrous hydrogen fluoride (HF), implemented to controllably etch aluminum oxide. Our ALE process utilizes the same chemistry previously demonstrated in the atomic layer deposition of aluminum fluoride thin films, and can therefore be exploited to remove the surface oxide from metallic aluminum and replace it with thin fluoride layers in order to improve the performance of ultraviolet aluminum mirrors. This ALE process is modified relative to existing methods through the use of a chamber-conditioning film of lithium fluoride, which is shown to enhance the loss of fluorine surface species and results in conformal layer-by-layer etching of aluminum oxide films. Etch properties were explored over a temperature range of 225 to 300 °C, with the Al2O3 etch rate increasing from 0.8 to 1.2 Å per ALE cycle at a fixed HF exposure of 60 ms per cycle. The effective etch rate depends on the total HF exposure, but the process is shown to be scalable to large-area substrates, with a post-etch uniformity of better than 2% demonstrated on 125 mm diameter wafers. The efficacy of the ALE process in reducing interfacial native aluminum oxide on evaporated aluminum mirrors is demonstrated with characterization by x-ray photoelectron spectroscopy and measurements of ultraviolet reflectance at wavelengths down to 120 nm.
Linux kernel compaction through cold code swapping
There is a growing trend to use general-purpose operating systems like Linux in embedded systems. Previous research focused on using compaction and specialization techniques to adapt a general-purpose OS to the memory-constrained environment presented by most embedded systems. However, there is still room for improvement: it has been shown that even after application of the aforementioned techniques, more than 50% of the kernel code remains unexecuted under normal system operation. We introduce a new technique that reduces the Linux kernel code memory footprint through on-demand loading of infrequently executed code, for systems that support virtual memory. In this paper, we describe our general approach, and we study code placement algorithms to minimize the performance impact of the code loading. A code size reduction of 68% is achieved, with a 2.2% speedup of system-mode execution time, for a case study based on the MediaBench II benchmark suite.
A review of High Performance Computing foundations for scientists
The increase of existing computational capabilities has made simulation emerge as a third discipline of Science, lying midway between the experimental and purely theoretical branches [1, 2]. Simulation enables the evaluation of quantities which otherwise would not be accessible, helps to improve experiments and provides new insights into the systems being analysed [3-6]. Knowing the fundamentals of computation can be very useful for scientists, for it can help them to improve the performance of their theoretical models and simulations. This review includes some technical essentials that can be useful to this end, and it is devised as a complement for researchers whose education is focused on scientific issues rather than on technological aspects. In this document we attempt to discuss the fundamentals of High Performance Computing (HPC) [7] in a way which is easy to understand without much previous background. We sketch the way standard computers and supercomputers work, as well as discuss distributed computing and the essential aspects to take into account when running scientific calculations on computers.
Research methodology of grazing
Throughout Europe, grass is the main feed for dairy cattle. This report presents the main results of the first meeting of the European Grassland Federation (EGF) Working Group Grazing in Kiel on 29 August 2010. The theme of the meeting was "Research methodology of grazing". There were three sessions: setting the scene; modelling of grazing; and field measurements.
Critical behavior at Mott-Anderson transition: a TMT-DMFT perspective
We present a detailed analysis of the critical behavior close to the Mott-Anderson transition. Our findings are based on a combination of numerical and analytical results obtained within the framework of Typical-Medium Theory (TMT-DMFT), the simplest extension of dynamical mean field theory (DMFT) capable of incorporating Anderson localization effects. By making use of previous scaling studies of Anderson impurity models close to the metal-insulator transition, we solve this problem analytically and reveal the dependence of the critical behavior on the particle-hole symmetry. Our main result is that, for sufficiently strong disorder, the Mott-Anderson transition is characterized by a precisely defined two-fluid behavior, in which only a fraction of the electrons undergo a "site-selective" Mott localization; the rest become Anderson-localized quasiparticles. Accepted for publication in Phys. Rev. Lett.
The role of the Intermediate Care Team in detecting and responding to loneliness in older clients
The Intermediate Care Team (ICT) supports patients in their own homes to manage complex needs. They are ideally placed in the community to identify older adults at risk of loneliness. However, little is known about how ICT professionals perceive, detect, or respond to loneliness in their clients. This study explores ICT professionals' attitudes to loneliness in the context of perceived service priorities and their experiences of managing loneliness in their clients. Eight ICT professionals (n=2 physiotherapists, n=3 occupational therapists, n=3 nurses) took part in semi-structured interviews. Data were analysed thematically using framework analysis, applying the Theory of Planned Behaviour as an interpretive framework. ICT professionals believed loneliness was a significant issue for many of their older clients but a low priority for ICT services. Study participants believed that loneliness often goes undetected because it is an issue that is difficult to measure objectively. Barriers to managing loneliness included high workload, unsatisfactory referral systems, and a lack of close working with social care and independent-sector services. Introducing brief but reliable loneliness assessments into routine practice, receiving training on detecting and managing loneliness, and improving working relationships with social care and independent-sector services were highlighted as strategies that could improve the detection and management of loneliness in ICT clients.
Self-tuned quantum dot gain in photonic crystal lasers
We demonstrate that very few (1 to 3) quantum dots as a gain medium are sufficient to realize a photonic crystal laser based on a high-quality nanocavity. Photon correlation measurements show a transition from a thermal to a coherent light state, proving that lasing action occurs at ultra-low thresholds. Observation of lasing is unexpected, since the cavity mode is in general not resonant with the discrete quantum dot states and emission at those frequencies is suppressed. In this situation, the quasi-continuous quantum dot states become crucial, since they provide an energy-transfer channel into the lasing mode, effectively leading to a self-tuned resonance for the gain medium.
Hennessy-Milner Logic with Greatest Fixed Points as a Complete Behavioural Specification Theory
There are two fundamentally different approaches to specifying and verifying properties of systems. The logical approach makes use of specifications given as formulae of temporal or modal logics and relies on efficient model checking algorithms; the behavioural approach exploits various equivalence or refinement checking methods, provided the specifications are given in the same formalism as implementations.

In this paper we provide translations between the logical formalism of Hennessy-Milner logic with greatest fixed points and the behavioural formalism of disjunctive modal transition systems. We also introduce a new operation of quotient for the above equivalent formalisms, which is adjoint to structural composition and allows synthesis of missing specifications from partial implementations. This is a substantial generalisation of the quotient for deterministic modal transition systems defined in earlier papers.
