Engaging with History after Macpherson
The Race Relations Amendment Act (2000) identifies a key role for education, and more specifically history, in promoting 'race equality' in Britain. In this article Ian Grosvenor and Kevin Myers consider the extent of young people's current engagement with the history of 'diversity, change and immigration' which underpins the commitment to 'race equality'. Finding that in many of Britain's schools and universities a singular and exclusionary version of history continues to dominate the curriculum, they go on to consider the reasons for the neglect of multiculturalism. The authors identify the development of an aggressive national identity that depends on the past for its legitimacy and argue that this sense of the past is an important obstacle to future progress.
Queues don't matter when you can JUMP them!
QJUMP is a simple and immediately deployable approach to controlling network interference in datacenter networks. Network interference occurs when congestion from throughput-intensive applications causes queueing that delays traffic from latency-sensitive applications. To mitigate network interference, QJUMP applies Internet QoS-inspired techniques to datacenter applications. Each application is assigned to a latency sensitivity level (or class). Packets from higher levels are rate-limited in the end host, but once allowed into the network can 'jump-the-queue' over packets from lower levels. In settings with known node counts and link speeds, QJUMP can support service levels ranging from strictly bounded latency (but with low rate) through to line-rate throughput (but with high latency variance).
We have implemented QJUMP as a Linux Traffic Control module. We show that QJUMP achieves bounded latency and reduces in-network interference by up to 300×, outperforming Ethernet Flow Control (802.3x), ECN (WRED) and DCTCP. We also show that QJUMP improves average flow completion times, performing close to or better than DCTCP and pFabric.
This work was supported by a Google Fellowship, EPSRC INTERNET Project EP/H040536/1, and the Defense Advanced Research Projects Agency (DARPA) and Air Force Research Laboratory (AFRL) under contract FA8750-11-C-0249. This is the final published version. It first appeared at https://www.usenix.org/conference/nsdi15/technical-sessions/presentation/grosvenor
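To make the mechanism concrete, here is a minimal Python sketch of the idea: per-class rate limiting enforced at the sending host combined with strict-priority service in the network. It is an illustrative simulation only, not the QJUMP Linux Traffic Control module, and the class rates, burst sizes, and packet sizes are assumptions chosen for the example.

```python
# Illustrative sketch (not the QJUMP Linux Traffic Control module): each
# application class gets a host-side token-bucket rate limit, and a switch
# port serves queued packets in strict priority order, so tightly rate-limited
# high classes "jump the queue" over bulk traffic.
import heapq

class TokenBucket:
    def __init__(self, rate_bps, burst_bytes):
        self.rate = rate_bps / 8.0      # bytes per second
        self.burst = burst_bytes
        self.tokens = burst_bytes
        self.last = 0.0

    def admit(self, now, size_bytes):
        """Refill tokens for elapsed time, then admit the packet only if enough remain."""
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= size_bytes:
            self.tokens -= size_bytes
            return True
        return False

class PrioritySwitch:
    """Single output port: always dequeue the highest class (lowest number) first."""
    def __init__(self):
        self.queue = []   # entries are (priority, arrival_seq, packet)
        self.seq = 0

    def enqueue(self, priority, packet):
        heapq.heappush(self.queue, (priority, self.seq, packet))
        self.seq += 1

    def dequeue(self):
        return heapq.heappop(self.queue)[2] if self.queue else None

# Hypothetical classes: 0 = latency-sensitive (tightly rate-limited),
# 1 = throughput-intensive (loosely rate-limited).
limits = {0: TokenBucket(10e6, 1500), 1: TokenBucket(1e9, 64_000)}
switch = PrioritySwitch()
for t, (cls, size) in enumerate([(1, 1500)] * 5 + [(0, 256)]):
    if limits[cls].admit(float(t) * 1e-6, size):
        switch.enqueue(cls, (cls, size))
print(switch.dequeue())   # the class-0 packet is served before the bulk backlog
```

In this toy run the latency-sensitive packet arrives behind a backlog of bulk packets yet is dequeued first, which is the jump-the-queue behaviour the abstract describes; the tight host-side rate limit on the high class is what keeps its in-network queueing bounded.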
Biochemical enrichment and biophysical characterization of a taste receptor for L-arginine from the catfish, Ictalurus punctatus
BACKGROUND: The channel catfish, Ictalurus punctatus, is invested with a high density of cutaneous taste receptors, particularly on the barbel appendages. Many of these receptors are sensitive to selected amino acids, one of these being a receptor for L-arginine (L-Arg). Previous neurophysiological and biophysical studies suggested that this taste receptor is coupled directly to a cation channel and behaves as a ligand-gated ion channel receptor (LGICR). Earlier studies demonstrated that two lectins, Ricinus communis agglutinin I (RCA-I) and Phaseolus vulgaris Erythroagglutinin (PHA-E), inhibited the binding of L-Arg to its presumed receptor sites, and that PHA-E inhibited the L-Arg-stimulated ion conductance of barbel membranes reconstituted into lipid bilayers. RESULTS: Both PHA-E and RCA-I almost exclusively labeled an 82–84 kDa protein band of an SDS-PAGE of solubilized barbel taste epithelial membranes. Further, both rhodamine-conjugated RCA-I and polyclonal antibodies raised to the 82–84 kDa electroeluted peptides labeled the apical region of catfish taste buds. Because of the specificity shown by RCA-I, lectin affinity was chosen as the first of a three-step procedure designed to enrich the presumed LGICR for L-Arg. Purified and CHAPS-solubilized taste epithelial membrane proteins were subjected successively to (1), lectin (RCA-I) affinity; (2), gel filtration (Sephacryl S-300HR); and (3), ion exchange chromatography. All fractions from each chromatography step were evaluated for L-Arg-induced ion channel activity by reconstituting each fraction into a lipid bilayer. Active fractions demonstrated L-Arg-induced channel activity that was inhibited by D-arginine (D-Arg) with kinetics nearly identical to those reported earlier for L-Arg-stimulated ion channels of native barbel membranes reconstituted into lipid bilayers. After the final enrichment step, SDS-PAGE of the active ion channel protein fraction revealed a single band at 82–84 kDa which may be interpreted as a component of a multimeric receptor/channel complex. CONCLUSIONS: The data are consistent with the supposition that the L-Arg receptor is a LGICR. This taste receptor remains active during biochemical enrichment procedures. This is the first report of enrichment of an active LGICR from the taste system of vertebrata.
The practical exploitation of tacit machine tool intelligence
Manufacturing equipment embraces an increasing measure of tacit intelligence, in both capacity and value. However, this intelligence is yet to be exploited effectively. This is due to both the costs and limitations of developed approaches and a deficient understanding of data value and data origin. This work investigates the principal limitations of typical machine tool data and encourages consideration of such inherent limitations in order to improve potential monitoring strategies. This work presents a novel approach to the acquisition and processing of machine tool cutting data. The approach considers the condition of the monitored system and the deterioration of cutting tool performance. The management of the cutting process by the machine tool controller forms the basis of the approach, and hence, makes use of the tacit intelligence that is deployed in such a task. By using available machine tool controller signals, the impact on day-to-day machining operations is minimised while avoiding the need to retrofit equipment or sensors. The potential of the approach in the contexts of the emerging internet of things and intelligent process management and monitoring is considered. The efficacy of the approach is evaluated by correlating the actively derived measure of process variation with an offline measurement of product form. The potential is then underlined through a series of experiments for which the derived variation is assessed as a direct measure of the cutting tool health. The proposed system is identified as both a viable alternative and synergistic addition to current approaches that mainly consider the form and features of the manufactured component.
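As a rough illustration of the evaluation step described above, the following Python sketch derives a per-part variation measure from controller spindle-load traces and correlates it with offline form measurements. The signal choice, the data values, and the use of a simple Pearson correlation are assumptions made for illustration; they are not the authors' actual method.

```python
# Minimal sketch, assuming per-part spindle-load traces are already logged by
# the machine tool controller; signal names, values and thresholds are
# hypothetical. The idea: derive a per-part variation measure from controller
# data and correlate it with an offline measurement of product form.
import statistics

def process_variation(spindle_load):
    """Spread of the controller's spindle-load signal over one cutting cycle."""
    return statistics.pstdev(spindle_load)

def pearson(x, y):
    """Plain Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: one load trace per machined part, plus the form error
# (e.g. in micrometres) measured offline for the same parts.
load_traces = [[4.1, 4.2, 4.0, 4.3], [4.0, 4.6, 3.8, 4.7], [3.9, 5.0, 3.5, 5.2]]
form_error_um = [2.1, 3.4, 5.0]

variation = [process_variation(trace) for trace in load_traces]
print(pearson(variation, form_error_um))  # a strong correlation would support using
                                          # the derived variation as a proxy for tool health
```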
Cancer survivors' experience with telehealth: A systematic review and thematic synthesis
Background: Net survival rates of cancer are increasing worldwide, placing a strain on health service provision. There is a drive to transfer the care of cancer survivors (individuals living with and beyond cancer) to the community and encourage them to play an active role in their own care. Telehealth, the use of technology in remote exchange of data and communication between patients and health care professionals (HCPs), is an important contributor to this evolving model of care. Telehealth interventions are 'complex', and understanding patient experiences of them is important in evaluating their impact. However, a wider view of patient experience is lacking as qualitative studies detailing cancer survivor engagement with telehealth are yet to be synthesized.
Objective: To systematically identify, appraise, and synthesize qualitative research evidence on the experiences of adult cancer survivors participating in telehealth interventions, to characterize the patient experience of telehealth interventions for this group.
Methods: Medline (PubMed), PsycINFO, Cumulative Index for Nursing and Allied Health Professionals (CINAHL), Embase, and Cochrane Central Register of Controlled Trials were searched on August 14, 2015, and March 8, 2016, for English-language papers published between 2006 and 2016. Inclusion criteria were as follows: adult cancer survivors aged 18 years and over, cancer diagnosis, experience of participating in a telehealth intervention (defined as remote communication or remote monitoring with an HCP delivered by telephone, Internet, or hand-held or mobile technology), and reporting qualitative data including verbatim quotes. An adapted Critical Appraisal Skills Programme (CASP) checklist for qualitative research was used to assess paper quality. The results section of each included article was coded line by line, and all papers underwent inductive analysis, involving comparison, reexamination, and grouping of codes to develop descriptive themes. Analytical themes were developed through an iterative process of reflection on, and interpretation of, the descriptive themes within and across studies.
Results: Across the 22 included papers, 3 analytical themes emerged, each with 3 descriptive subthemes: (1) influence of telehealth on the disrupted lives of cancer survivors (convenience, independence, and burden); (2) personalized care across physical distance (time, space, and the human factor); and (3) remote reassurance, a safety net of health care professional connection (active connection, passive connection, and slipping through the net). Telehealth interventions represent a convenient approach, which can potentially minimize treatment burden and disruption to cancer survivors' lives. Telehealth interventions can facilitate an experience of personalized care and reassurance for those living with and beyond cancer; however, it is important to consider individual factors when tailoring interventions to ensure engagement promotes benefit rather than burden.
Conclusions: Telehealth interventions can provide cancer survivors with independence and reassurance. Future telehealth interventions need to be developed iteratively in collaboration with a broad range of cancer survivors to maximize engagement and benefit.
Long-term remission of myopic choroidal neovascular membrane after treatment with ranibizumab: a case report
Introduction: Myopia has become a major public health problem in certain parts of the world. Sight-threatening complications like choroidal neovascularisation membranes occur in up to 10% of pathological myopia, and natural history studies show a trend towards progressive visual loss. There are long-term financial and quality-of-life implications in this group of patients, and treatment strategies should aim for long-term preservation of vision.
Case presentation: A 56-year-old Caucasian woman presented with a best-corrected visual acuity of 6/6-1 in her right eye and 6/24 in her left. Fundal examination revealed pathological myopia in both eyes and an elevated lesion associated with pre-retinal haemorrhage in the left macula. Ocular coherence tomography and fundus fluorescein angiogram confirmed a subfoveal classic choroidal neovascularisation membrane. The patient decided to proceed with intravitreal ranibizumab (0.5 mg) therapy. One month after treatment, best-corrected visual acuity improved to 6/12 in her left eye, with complete resolution of subretinal fluid on ocular coherence tomography. After three months, best-corrected visual acuity further improved to 6/9, which was maintained up to 16 months post-treatment.
Conclusion: We suggest intravitreal ranibizumab as an alternative treatment for long-term remission of myopic choroidal neovascular membrane. This case also suggests that myopic choroidal neovascularisation membranes may require fewer treatments to achieve sustained remission. Furthermore, this could serve as a feasible long-term management option if used in conjunction with ocular coherence tomography.
Development of aerosol activation in the double-moment Unified Model and evaluation with CLARIFY measurements
Representing the number and mass of cloud and aerosol particles independently in a climate, weather prediction or air quality model is important in order to simulate aerosol direct and indirect effects on radiation balance. Here we introduce the first configuration of the UK Met Office Unified Model in which both cloud and aerosol particles have 'double-moment' representations with prognostic number and mass. The GLObal Model of Aerosol Processes (GLOMAP) aerosol microphysics scheme, already used in the Hadley Centre Global Environmental Model version 3 (HadGEM3) climate configuration, is coupled to the Cloud AeroSol Interacting Microphysics (CASIM) cloud microphysics scheme. We demonstrate the performance of the new configuration in high-resolution simulations of a case study defined from the CLARIFY aircraft campaign in 2017 near Ascension Island in the tropical southern Atlantic. We improve the physical basis of the activation scheme by representing the effect of existing cloud droplets on the activation of new aerosol, and we also discuss the effect of unresolved vertical velocities. We show that neglect of these two competing effects in previous studies led to compensating errors but realistic droplet concentrations. While these changes lead only to a modest improvement in model performance, they reinforce our confidence in the ability of the model microphysics code to simulate the aerosol–cloud microphysical interactions it was designed to represent. Capturing these interactions accurately is critical to simulating aerosol effects on climate.
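For readers unfamiliar with activation calculations, a simplified Python sketch follows. It computes the activated number fraction of a single lognormal aerosol mode at a prescribed peak supersaturation using kappa-Köhler theory. This is an illustrative stand-in, not the GLOMAP/CASIM activation scheme: the two effects highlighted in the abstract (competition from existing cloud droplets and unresolved vertical velocities) determine the peak supersaturation that is simply prescribed here, and all parameter values are assumptions.

```python
# Minimal illustrative sketch (not the GLOMAP/CASIM activation scheme): the
# activated fraction of one lognormal aerosol mode at a prescribed peak
# supersaturation, via a kappa-Koehler critical dry diameter.
import math

def kelvin_a(temperature_k, surface_tension=0.072, m_water=0.018,
             rho_water=1000.0, r_gas=8.314):
    """Kelvin coefficient A (m) in the Koehler equation."""
    return 4.0 * surface_tension * m_water / (r_gas * temperature_k * rho_water)

def critical_dry_diameter(supersaturation, kappa, temperature_k):
    """Smallest dry diameter (m) that activates at fractional supersaturation s."""
    a = kelvin_a(temperature_k)
    return (4.0 * a**3 / (27.0 * kappa * math.log(1.0 + supersaturation)**2)) ** (1.0 / 3.0)

def activated_fraction(d_median, sigma_g, supersaturation, kappa, temperature_k=285.0):
    """Number fraction of a lognormal mode with dry diameter above the critical value."""
    d_crit = critical_dry_diameter(supersaturation, kappa, temperature_k)
    z = math.log(d_crit / d_median) / (math.sqrt(2.0) * math.log(sigma_g))
    return 0.5 * math.erfc(z)

# Hypothetical accumulation mode: 80 nm median diameter, sigma_g = 1.6,
# kappa = 0.6, peak supersaturation of 0.2 %.
print(activated_fraction(80e-9, 1.6, 0.002, 0.6))
```

In a double-moment scheme this activated fraction, applied to the mode's prognostic number and mass, is what supplies new droplet number to the cloud microphysics; accounting for droplets that already exist reduces the available supersaturation and hence the activated fraction, which is the competition effect the abstract refers to.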
At the extremes of exclusion: Deportation, detention and dispersal
Deportation, detention and dispersal have formed an occasional part of Britain's migration regime throughout the twentieth century, though they tended to be used in response to particular events or 'crises'. By the end of the twentieth century, however, deportation, detention and, most recently, dispersal have become 'normalized', 'essential' instruments in the ongoing attempt to control or manage immigration to Britain. This article outlines the use of detention, deportation and dispersal in the twentieth century, exploring how they have evolved and then become an integral part of the migration regime into the twenty-first century. Where appropriate, British practices are compared with those of its European neighbours, where, to differing degrees, deportation, detention and dispersal have also become everyday practices. In examining these practices in Britain, we consider the rationale and stated aims of their employment, as well as describing some of the consequences, where known, of detention, deportation and dispersal.
Learning to Make Sense: Interdisciplinary Perspectives on Sensory Education and Embodied Enculturation
In this introductory essay we examine through a 'temporally inflected lens' some of the complex entanglements of learning, senses, and sense making; body-sensory experience and practice; and culture and society. We thereby aim to bring into dialogue inter-/multisensorial approaches to education as a project and praxis and processes of 'enculturation', which have always, in one way or the other, involved 'embodied' learning (and imaginaries thereof), rather than mere 'mental processing'. We first situate the 'turn' to the senses, across a range of disciplinary fields, brought on by a growing interest in 'modes of meaning-making', including the visual, aural, audio-visual, material, bodily, and spatial. Secondly, we investigate the explanatory potential of enculturation and embodiment as seemingly entangled notions. From this, we derive the concept of 'embodied enculturation' for the study of situated, historical entanglements of sensory learning and education. We link this proposed research paradigm to incisive scholarship on 'cultural learning' through sensorial lenses, after which we tease out six key questions or concerns emerging from a review of relevant, recent research. These key concerns help to contextualize state-of-the-art 'sensuous education scholarship' introduced in the final section of the article and elaborated further in the ensuing contributions to this special issue.
Reaping the benefits of digitisation: Pilot study exploring revenue generation from digitised collections through technological innovation
In the last decade significant resources have been invested in the digitisation of the collections of a large number of museums and galleries worldwide. In Europe alone, 10 million EUR is annually invested in Europeana (Europeana 2014). However, as we gradually move on from 'the start-up phase' of digitisation (Hughes 2004), revenue generation and sustainability must be considered (Hughes 2004). Even beyond digitisation, generating revenue through innovation and in particular 'finding new business models to sustain funding' (Simon 2011) ranks amongst museums' top challenges (Simon 2011). More importantly, despite the significant wealth of digitised assets museums now own, little has been done to investigate ways these institutions could financially benefit from their digitised collections.
For art institutions in particular, this has been largely limited to the sale of image licenses, with the fear of losing this revenue being one of the key reasons art museums are reluctant to join the Open Content movement (Kapsalis 2016). This paper examines how recent technological advancements, such as image recognition and Print-on-Demand automation, can be utilised to take advantage of the wealth of digitised artworks museums and galleries have in their possession. A pilot study of the proposed solution at the State Museum of Contemporary Art (SMCA) in Thessaloniki, Greece, is covered and the findings are examined.
Early feedback indicates that there is significant potential in the utilisation of the aforementioned technologies for the monetisation of digitised collections. However, challenges such as blending the real-world experience with the digital experience, as well as flattening the learning curve of the technological solution for museum visitors, need to be addressed. Based on the pilot study at SMCA, this paper investigates how emerging technologies can be utilised to facilitate revenue generation for all museums and galleries with digitised collections.
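To illustrate how such a pipeline might fit together, the Python sketch below matches a visitor's photo against a digitised collection using a perceptual-hash lookup and then submits a print order. The matching method, the sample data, and the order_print stand-in for a Print-on-Demand service are all hypothetical; this is not the system piloted at SMCA.

```python
# Illustrative sketch only; the image-matching approach and the print-on-demand
# call below are hypothetical stand-ins, not the SMCA pilot's actual stack.
# Flow: match a visitor's photo against the digitised collection, then place
# a print order for the recognised artwork.
from dataclasses import dataclass

@dataclass
class Artwork:
    artwork_id: str
    title: str
    phash: int          # perceptual hash of the digitised image (precomputed)

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def recognise(photo_phash: int, collection, max_distance: int = 10):
    """Return the closest artwork if its hash is within max_distance bits."""
    best = min(collection, key=lambda art: hamming(photo_phash, art.phash))
    return best if hamming(photo_phash, best.phash) <= max_distance else None

def order_print(artwork: Artwork, size: str) -> dict:
    """Hypothetical stand-in for a call to a print-on-demand service."""
    return {"artwork_id": artwork.artwork_id, "size": size, "status": "submitted"}

collection = [Artwork("smca-001", "Untitled", 0b1011_0110_0101_0001),
              Artwork("smca-002", "Composition", 0b0100_1001_1010_1110)]
match = recognise(0b1011_0110_0101_0011, collection)   # hash of the visitor's photo
if match:
    print(order_print(match, "A3"))
```

A design choice worth noting is that the matching and the fulfilment are decoupled: the recognition step only has to return a collection identifier, so the same lookup could drive interpretation content in a gallery app as well as the revenue-generating print workflow.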
- …