11,873 research outputs found
The game jam movement: disruption, performance and artwork
This paper explores the current conventions and intentions of the game jam - contemporary events that encourage the rapid, collaborative creation of game design prototypes. Game jams are often renowned for their capacity to encourage creativity and the development of alternative, innovative game designs. However, there is a growing necessity for game jams to continue to challenge traditional development practices through evolving new formats and perspectives, to maintain the game jam as a disruptive, refreshing aspect of game development culture. As in other creative jam-style events, a game jam is not only a process but also an outcome. Through a discussion of the literature, this paper establishes a theoretical basis with which to analyse game jams as disruptive, performative processes that result in original creative artefacts. In support of this, a case study analysis of Development Cultures, a series of workshops that centred on innovation and new forms of practice through play, chance, and experimentation, is presented. The findings indicate that game jams can be considered as processes that inspire creativity within a community and that the resulting performances can be considered a form of creative artefact; thus, parallels can be drawn between game jams and performative and interactive art.
Industrial laser welding: An evaluation
Report describes a 10-kW laser welding system designed to weld large structures made from 1/4-inch and 1/2-inch aluminum (2219) and D6AC steel.
The Herschel SPIRE Fourier Transform Spectrometer Spectral Feature Finder II. Estimating Radial Velocity of SPIRE Spectral Observation Sources
The Herschel SPIRE FTS Spectral Feature Finder (FF) detects significant spectral features within SPIRE spectra and employs two routines, and external references, to estimate source radial velocity. The first routine is based on the identification of rotational CO emission; the second cross-correlates detected features with a line template containing most of the characteristic lines in typical far-infrared observations. In this paper, we outline and validate these routines, summarise the results as they pertain to the FF, and comment on how external references were incorporated.
Comment: 12 pages, 16 figures, 1 table, accepted by MNRAS March 202
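The FF's actual velocity routines belong to the Herschel/SPIRE pipeline; purely as an illustration of the cross-correlation idea described above, the sketch below matches a set of detected feature frequencies against a rest-frame line template (here, a few CO rotational lines) over a grid of trial velocities. The function name, tolerance, velocity grid, and example detections are all assumptions, not the FF implementation.

```python
# Illustrative sketch (not the FF code): estimate a source radial velocity by
# cross-matching detected spectral features against a rest-frame line template.
import numpy as np

C_KMS = 299_792.458  # speed of light in km/s

def estimate_radial_velocity(detected_ghz, template_ghz,
                             v_grid_kms=np.arange(-3000, 3001, 5),
                             tol_ghz=0.3):
    """Return the trial velocity (km/s) that matches the most template lines.

    detected_ghz : frequencies of significant features found in the spectrum
    template_ghz : rest-frame frequencies of candidate lines (e.g. the CO ladder)
    """
    detected = np.asarray(detected_ghz)
    best_v, best_hits = 0.0, -1
    for v in v_grid_kms:
        # Non-relativistic Doppler shift: nu_obs = nu_rest / (1 + v/c)
        shifted = np.asarray(template_ghz) / (1.0 + v / C_KMS)
        # Count template lines landing within tol_ghz of any detected feature
        hits = sum(np.min(np.abs(detected - nu)) < tol_ghz for nu in shifted)
        if hits > best_hits:
            best_v, best_hits = v, hits
    return best_v, best_hits

# Example: approximate CO J_up -> J_up-1 rest frequencies (GHz) and some
# made-up detections shifted by a few hundred km/s plus one spurious feature.
co_ladder = [461.041, 576.268, 691.473, 806.652, 921.800]
features  = [460.3, 575.4, 690.4, 805.4, 1100.0]
print(estimate_radial_velocity(features, co_ladder))
```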
Laser frequency comb techniques for precise astronomical spectroscopy
Precise astronomical spectroscopic analyses routinely assume that individual pixels in charge-coupled devices (CCDs) have uniform sensitivity to photons. Intra-pixel sensitivity (IPS) variations may already cause small systematic errors in, for example, studies of extra-solar planets via stellar radial velocities and cosmological variability in fundamental constants via quasar spectroscopy, but future experiments requiring velocity precisions approaching ~1 cm/s will be more strongly affected. Laser frequency combs have been shown to provide highly precise wavelength calibration for astronomical spectrographs, but here we show that they can also be used to measure IPS variations in astronomical CCDs in situ. We successfully tested a laser frequency comb system on the Ultra-High Resolution Facility spectrograph at the Anglo-Australian Telescope. By modelling the 2-dimensional comb signal recorded in a single CCD exposure, we find that the average IPS deviates by <8 per cent if it is assumed to vary symmetrically about the pixel centre. We also demonstrate that series of comb exposures with absolutely known offsets between them can yield tighter constraints on symmetric IPS variations from ~100 pixels. We discuss measurement of asymmetric IPS variations and absolute wavelength calibration of astronomical spectrographs and CCDs using frequency combs.
Comment: 11 pages, 7 figures. Accepted for publication in MNRA
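As a rough illustration of the idea of constraining a symmetric intra-pixel sensitivity model from comb lines sampling many sub-pixel positions, here is a small simulation sketch; the cosine IPS model, line width, noise level, and all parameter values are my assumptions and not the method or numbers used in the paper.

```python
# Minimal sketch (illustration only, not the paper's pipeline): recover a
# symmetric intra-pixel sensitivity (IPS) amplitude from narrow comb lines.
import numpy as np

rng = np.random.default_rng(1)

A_TRUE = 0.05          # assumed IPS modulation amplitude
SIGMA  = 0.05          # comb-line width in pixels (much narrower than a pixel)

def ips(x, a):
    """Assumed symmetric IPS model: sensitivity vs position, periodic per pixel."""
    u = (x % 1.0) - 0.5            # position relative to the pixel centre
    return 1.0 + a * np.cos(2 * np.pi * u)

def recorded_flux(x0, a, n=4001):
    """Flux of a unit-flux Gaussian comb line at x0, seen through the IPS."""
    x = np.linspace(x0 - 5 * SIGMA, x0 + 5 * SIGMA, n)
    profile = np.exp(-0.5 * ((x - x0) / SIGMA) ** 2) / (SIGMA * np.sqrt(2 * np.pi))
    dx = x[1] - x[0]
    return np.sum(profile * ips(x, a)) * dx   # simple numerical integration

# Simulate many comb lines at random sub-pixel positions, plus small noise.
centres = rng.uniform(0, 1, 500)
fluxes  = np.array([recorded_flux(c, A_TRUE) for c in centres])
fluxes += rng.normal(0, 2e-3, fluxes.size)

# Least-squares fit of flux = c0 + c1*cos(2*pi*u) gives the IPS amplitude c1/c0.
u = centres - 0.5
X = np.column_stack([np.ones_like(u), np.cos(2 * np.pi * u)])
c0, c1 = np.linalg.lstsq(X, fluxes, rcond=None)[0]
print(f"recovered IPS amplitude ~ {c1 / c0:.3f} "
      f"(true value {A_TRUE}, slightly diluted by the finite line width)")
```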
SN 2016iet: The Pulsational or Pair Instability Explosion of a Low Metallicity Massive CO Core Embedded in a Dense Hydrogen-Poor Circumstellar Medium
We present optical photometry and spectroscopy of SN 2016iet, an unprecedented Type I supernova (SN) at with no obvious analog in the existing literature. The peculiar light curve has two roughly equal brightness peaks ( mag) separated by 100 days, and a subsequent slow decline by 5 mag in 650 rest-frame days. The spectra are dominated by emission lines of calcium and oxygen, with a width of only km s, superposed on a strong blue continuum in the first year, and with a large ratio of at late times. There is no clear evidence for hydrogen or helium associated with the SN at any phase. We model the light curves with several potential energy sources: radioactive decay, central engine, and circumstellar medium (CSM) interaction. Regardless of the model, the inferred progenitor mass near the end of its life (i.e., CO core mass) is M and up to M, placing the event in the regime of pulsational pair instability supernovae (PPISNe) or pair instability supernovae (PISNe). The models of CSM interaction provide the most consistent explanation for the light curves and spectra, and require a CSM mass of M ejected in the final decade before explosion. We further find that SN 2016iet is located at an unusually large offset ( kpc) from its low metallicity dwarf host galaxy ( Z, M), supporting the PPISN/PISN interpretation. In the final spectrum, we detect narrow H emission at the SN location, likely due to a dim underlying galaxy host or an H II region. Despite the overall consistency of the SN and its unusual environment with PPISNe and PISNe, we find that the inferred properties of SN 2016iet challenge existing models of such events.
Comment: 26 Pages, 17 Figures, Submitted to Ap
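For context on the "radioactive decay" power source listed among the models above, a minimal, fully trapped 56Ni to 56Co heating function can be written as below using standard specific heating rates and decay timescales; this is a generic textbook-style sketch, not the light-curve models actually fitted in the paper, and the example masses and epochs are arbitrary.

```python
# Minimal sketch of one generic SN power source: fully trapped
# 56Ni -> 56Co -> 56Fe radioactive heating (not the paper's model).
import numpy as np

M_SUN  = 1.989e33    # g
EPS_NI = 3.9e10      # erg / s / g, 56Ni specific heating rate
EPS_CO = 6.78e9      # erg / s / g, 56Co specific heating rate
TAU_NI = 8.8         # days, 56Ni decay timescale
TAU_CO = 111.3       # days, 56Co decay timescale

def decay_luminosity(t_days, m_ni_msun):
    """Instantaneous radioactive heating (erg/s) for a given 56Ni mass,
    assuming full gamma-ray and positron trapping."""
    m = m_ni_msun * M_SUN
    return m * ((EPS_NI - EPS_CO) * np.exp(-t_days / TAU_NI)
                + EPS_CO * np.exp(-t_days / TAU_CO))

# Arbitrary example epochs, for 1 solar mass of 56Ni
for ti in (20.0, 100.0, 400.0):
    print(f"t = {ti:5.0f} d : L = {decay_luminosity(ti, 1.0):.2e} erg/s")
```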
Tradition and Prudence in Locke's Exceptions to Toleration
Why did Locke exclude Catholics and atheists from toleration? Not, I contend, because he was trapped by his context, but because his prudential approach and practical judgments led him to traditional texts. I make this argument first by outlining the connections among prudential exceptionality, practical judgments, and traditional texts. I then describe important continuities between conventional English understandings of the relationship between state and religion and Locke's writings on toleration, discuss Locke's conception of rights, and illustrate his use of prudential exceptions and distinctions. I conclude by arguing that Locke's problems are relevant to assessing contemporary liberal discussions of toleration and the separation of state and religion that lean heavily on practical justification.
The development of a new measure of quality of life in the management of gastro-oesophageal reflux disease: the Reflux Questionnaire.
INTRODUCTION
This paper reports on the development of a new measure of health-related quality of life for use among patients with gastro-oesophageal reflux disease (GORD), funded as part of the REFLUX trial. This is a large UK multicentre trial that aims to compare the clinical and cost-effectiveness of minimal access surgery with best medical treatment for patients with GORD within the NHS.
METHOD
Potential items were identified via a series of interviews and focus groups carried out with patients who were receiving or had received medical or surgical treatment for GORD. The final measure consisted of 31 items covering 7 categories (Heartburn; Acid reflux; Wind; Eating and swallowing; Bowel movements; Sleep; Work, physical and social activities). The measure produced two outputs: a quality of life score (RQLS) and five Reflux symptom scores. Reliability (internal consistency), criterion validity against the SF-36, and sensitivity to change, in terms of its relationship with reported change in prescribed medication, were assessed in a sample of 794 patients recruited into the trial.
RESULTS
The measure was shown to be internally consistent, to show criterion validity with the SF-36, and to be sensitive to changes in patients' use of prescribed medication between baseline and 3-month follow-up.
DISCUSSION
The Reflux Questionnaire is a new self-administered questionnaire for use amongst patients with GORD. Initial findings suggest that the new measure is valid, reliable, acceptable to respondents, and simple to administer in both clinical and research contexts.
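As a hedged illustration of what an internal-consistency check of this kind typically involves, the sketch below computes Cronbach's alpha for a synthetic 794 x 31 item matrix; the data, scoring range, and helper function are hypothetical and are not drawn from the REFLUX trial.

```python
# Illustrative only: one common way to quantify internal consistency
# (Cronbach's alpha) for a multi-item questionnaire such as a 31-item measure.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses: 794 respondents x 31 items scored 0-4, driven by a
# shared latent "symptom severity" so the items correlate.
rng = np.random.default_rng(0)
latent = rng.normal(size=(794, 1))
scores = np.clip(np.round(2 + latent + rng.normal(scale=0.8, size=(794, 31))), 0, 4)
print(f"Cronbach's alpha ~ {cronbach_alpha(scores):.2f}")
```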
Global parameter search reveals design principles of the mammalian circadian clock
Background: Virtually all living organisms have evolved a circadian (~24 hour) clock that controls physiological and behavioural processes with exquisite precision throughout the day/night cycle. The suprachiasmatic nucleus (SCN), which generates these ~24 h rhythms in mammals, consists of several thousand neurons. Each neuron contains a gene-regulatory network generating molecular oscillations, and the individual neuron oscillations are synchronised by intercellular coupling, presumably via neurotransmitters. Although this basic mechanism is currently accepted and has been recapitulated in mathematical models, several fundamental questions about the design principles of the SCN remain poorly understood. For example, a remarkable property of the SCN is that the phase of the SCN rhythm resets rapidly after a 'jet lag' type experiment, i.e. when the light/dark (LD) cycle is abruptly advanced or delayed by several hours.
Results: Here, we describe an extensive parameter optimization of a previously constructed simplified model of the SCN in order to further understand its design principles. By examining the top 50 solutions from the parameter optimization, we show that the neurotransmitters' role in generating the molecular circadian rhythms is extremely important. In addition, we show that when a neurotransmitter drives the rhythm of a system of coupled damped oscillators, it exhibits very robust synchronization and is much more easily entrained to light/dark cycles. We were also able to recreate in our simulations the fast rhythm resetting seen after a 'jet lag' type experiment.
Conclusion: Our work shows that a careful exploration of parameter space for even an extremely simplified model of the mammalian clock can reveal unexpected behaviours and non-trivial predictions. Our results suggest that the neurotransmitter feedback loop plays a crucial role in the robustness and phase-resetting properties of the mammalian clock, even at the single-neuron level.
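To make the "coupled damped oscillators driven by a neurotransmitter" idea concrete, here is a toy mean-field simulation, not the authors' SCN model: each cell is a damped linear oscillator, and a shared drive (mean population activity plus a 12 h:12 h light term) sustains and synchronises the population rhythm. All parameter values are assumptions chosen purely for illustration.

```python
# Toy illustration (not the paper's SCN model): damped oscillators that hold a
# coherent ~24 h rhythm when driven by a shared "neurotransmitter" mean field
# and a light/dark cycle.
import numpy as np

N      = 50        # number of model neurons
LAMBDA = 0.05      # damping rate (1/h): each cell alone rings down
K      = 0.08      # strength of the mean-field (neurotransmitter) drive
L_AMP  = 0.02      # light input amplitude
DT     = 0.1       # time step (h)
T_END  = 480.0     # simulate 20 days

rng = np.random.default_rng(0)
periods = 24.0 + rng.normal(0, 0.5, N)   # heterogeneous intrinsic periods
omega   = 2 * np.pi / periods
x = rng.normal(0, 0.1, N)                # e.g. clock-gene mRNA proxy
y = rng.normal(0, 0.1, N)

def light(t):
    """Simple 12 h:12 h light/dark square wave."""
    return L_AMP if (t % 24.0) < 12.0 else 0.0

sync = []
for t in np.arange(0.0, T_END, DT):
    drive = K * x.mean() + light(t)      # shared neurotransmitter + light input
    dx = -LAMBDA * x - omega * y + drive
    dy =  omega * x - LAMBDA * y
    x, y = x + DT * dx, y + DT * dy      # explicit Euler step
    z = x + 1j * y
    sync.append(np.abs(z.mean()) / (np.abs(z).mean() + 1e-12))

# Coherence of the population over the final simulated day (1 = fully aligned)
print(f"final synchronisation index ~ {np.mean(sync[-240:]):.2f}")
```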
Conversion and Extraction of Insoluble Organic Materials in Meteorites
We endeavor to develop and implement methods in our laboratory to convert and extract insoluble organic materials (IOM) from low carbon-bearing meteorites (such as ordinary chondrites) and Precambrian terrestrial rocks for the purpose of determining IOM structure and prebiotic chemistries preserved in these types of samples. The general scheme of converting and extracting IOM in samples is summarized in Figure 1. First, powdered samples are solvent extracted in a micro-Soxhlet apparatus multiple times using solvents ranging from non-polar to polar (hexane - non-polar, dichloromethane - non-polar to polar, methanol - polar protic, and acetonitrile - polar aprotic). Second, the solid residue from the solvent extractions is processed using strong acids, hydrochloric and hydrofluoric, to dissolve minerals and isolate the IOM. Third, the isolated IOM is subjected to both thermal (pyrolysis) and chemical (oxidation) degradation to release compounds from the macromolecular material. Finally, products from oxidation and pyrolysis are analyzed by gas chromatography - mass spectrometry (GCMS). We are working toward an integrated method and analysis scheme that will allow us to determine prebiotic chemistries in ordinary chondrites and Precambrian terrestrial rocks. Powerful techniques that we are including are stepwise, flash, and gradual pyrolysis and ruthenium tetroxide oxidation. More details of the integrated scheme will be presented.
Formal change impact analyses for emulated control software
Processor emulators are a software tool for allowing legacy computer programs to be executed on a modern processor. In the past, emulators have been used in trivial applications such as the maintenance of video games. Now, however, processor emulation is being applied to safety-critical control systems, including military avionics. These applications demand the utmost guarantees of correctness, but no verification techniques exist for proving that an emulated system preserves the original system's functional and timing properties. Here we show how this can be done by combining concepts previously used for reasoning about real-time program compilation with an understanding of the old and new software architectures. In particular, we show how both the old and new systems can be given a common semantics, thus allowing their behaviours to be compared directly.
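The paper's contribution is a formal, tool-supported semantics; purely to illustrate the idea of giving old and new systems a common semantics so their behaviours can be compared directly, the sketch below interprets a toy legacy instruction set and an alternative "emulated" implementation over one shared abstract state and checks that their observable traces coincide. The instruction set, state, and functions are invented for the example.

```python
# Illustrative sketch only (the paper uses formal semantics, not Python):
# give a legacy program and its emulated execution a common state-transition
# semantics, then compare their observable behaviours step by step.
from dataclasses import dataclass, field

@dataclass
class State:
    """Shared abstract machine state used by both semantics."""
    pc: int = 0
    acc: int = 0
    out: list = field(default_factory=list)   # observable outputs

# A toy legacy instruction set: ("LOAD", n), ("ADD", n), ("OUT",)
def legacy_step(prog, s: State) -> State:
    op, *args = prog[s.pc]
    if op == "LOAD":
        s.acc = args[0]
    elif op == "ADD":
        s.acc += args[0]
    elif op == "OUT":
        s.out.append(s.acc)
    s.pc += 1
    return s

# The "emulated" semantics: a different implementation that must induce the
# same transitions on the common state (here ADD is realised by repeated +1).
def emulated_step(prog, s: State) -> State:
    op, *args = prog[s.pc]
    if op == "LOAD":
        s.acc = args[0]
    elif op == "ADD":
        for _ in range(args[0]):
            s.acc += 1
    elif op == "OUT":
        s.out.append(s.acc)
    s.pc += 1
    return s

def traces_agree(prog) -> bool:
    """Run both semantics from the same initial state and compare every state."""
    a, b = State(), State()
    for _ in prog:
        a, b = legacy_step(prog, a), emulated_step(prog, b)
        if (a.pc, a.acc, a.out) != (b.pc, b.acc, b.out):
            return False
    return True

print(traces_agree([("LOAD", 2), ("ADD", 3), ("OUT",)]))   # True
```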
