801 research outputs found
Will the Scottish Cancer Target for the year 2000 be met? The use of cancer registration and death records to predict future cancer incidence and mortality in Scotland.
Cancer mortality data reflect disease incidence and the effectiveness of treatment. Incidence data, however, reflect the burden of disease in the population and indicate the need for prevention measures, diagnostic services and cancer treatment facilities. Monitoring of targets therefore requires that both be considered. The Scottish Cancer Target, established in 1991, proposed that a reduction of 15% in mortality from cancer in the under-65s should be achieved between 1986 and 2000. Each year in Scotland approximately 8300 persons under 65 are diagnosed with cancer and 4500 die from the disease. The most common malignancies in the under-65s, in terms of both incident cases and deaths, are lung and large bowel cancer in males, and breast, large bowel and lung cancer in females. A decrease of 6% in the number of cancer cases diagnosed in males under 65 is predicted between 1986 and 2000, whereas the number of cases in females in the year 2000 is expected to remain at the 1986 level. In contrast, substantial reductions in mortality are expected for both sexes: 17% in males and 25% in females. Demographic changes will influence the numbers of cancer cases and deaths in the Scottish population in the year 2000. However, long-term trends in the major risk factors, such as smoking, are likely to be the most important determinants of the future cancer burden.
Unpacking the thinking and making behind a slow technology research product with slow game
Motivated by prior work on everyday creativity, we adopt a design-oriented approach that seeks to move beyond designing for explicit interactions to also include the implicit, incremental and, at times, even unknowing encounters that slowly emerge among people, technologies, and artifacts over time. We contribute an investigation into designing for slowness grounded in the practice of making a design artifact called Slow Game. We offer a detailed critical-reflective account of our process of making Slow Game into a research product. In attending to key design moves across our process, we reveal hidden challenges in designing slow technology research products and discuss how our findings can be mobilized in future work.
Molecular epidemiology of imported cases of leishmaniasis in Australia from 2008 to 2014
© 2015 Roberts et al. Leishmaniasis is a vector-borne disease caused by protozoa of the genus Leishmania. Human leishmaniasis is not endemic in Australia, though imported cases are regularly encountered. This study aimed to provide an update on the molecular epidemiology of imported leishmaniasis in Australia. Of a total of 206 biopsy and bone marrow specimens submitted to St Vincent's Hospital Sydney for leishmaniasis diagnosis by PCR, 55 were found to be positive for Leishmania DNA. All PCR products were subjected to restriction fragment length polymorphism analysis to identify the causative species. Five Leishmania species/species complexes were identified, with Leishmania tropica the most common (30/55). Travel or prior residence in a Leishmania-endemic region was the most common route of acquisition, with ∼47% of patients having lived in or travelled to Afghanistan. Cutaneous leishmaniasis was the most common manifestation (94%), with only 3 cases of visceral leishmaniasis and no cases of mucocutaneous leishmaniasis encountered. This report indicates that imported leishmaniasis is becoming increasingly common in Australia due to an increase in global travel and immigration. As such, Australian clinicians must be made aware of this trend and consider leishmaniasis in patients with suspicious symptoms and a history of travel in endemic areas. This study also discusses the recent identification of a unique Leishmania species found in native kangaroos and a potential vector host, which could create the opportunity for the establishment of a local transmission cycle involving humans.
Manipulation and removal of defects in spontaneous optical patterns
Defects play an important role in a number of fields dealing with ordered structures. They are often described in terms of their topology, mutual interaction and statistical characteristics. We demonstrate theoretically and experimentally the possibility of active manipulation and removal of defects. We focus on the spontaneous formation of two-dimensional spatial structures in a nonlinear optical system, a liquid crystal light valve under single optical feedback. With increasing distance from threshold, the spontaneously formed hexagonal pattern becomes disordered and contains several defects. A scheme based on Fourier filtering allows us to remove defects and to restore spatial order. Starting without control, the controlled area is progressively expanded, such that defects are swept out of the active area. Comment: 4 pages, 4 figures.
Income Inequality Developments in the Great Recession
The Great Recession has increased concerns over the fairness of the distribution of wealth and income in many societies. Using data on eight advanced economies (Germany, Greece, Ireland, Italy, Slovakia, Spain, the United Kingdom, and the United States) between 2007 and 2010, I show how the Great Recession affected income inequality in different countries and how families and the state tried to mitigate its impact, through redistributing income within households and through the tax and benefit system. In most countries, redistribution within households, through the social safety net and through direct taxes, was largely successful in offsetting the effect on income inequality of the increased earnings inequality caused by the rise in unemployment in this pre-austerity period. I discuss some policy lessons that emerge from the varying experiences of different countries.
Methodological approaches to determining the marine radiocarbon reservoir effect
The marine radiocarbon reservoir effect is an offset in <sup>14</sup>C age between contemporaneous organisms from the terrestrial environment and organisms that derive their carbon from the marine environment. Quantification of this effect is of crucial importance for correct calibration of the <sup>14</sup>C ages of marine-influenced samples to the calendrical timescale. This is fundamental to the construction of archaeological and palaeoenvironmental chronologies when such samples are employed in <sup>14</sup>C analysis. Quantitative measurements of temporal variations in regional marine reservoir ages also have the potential to be used as a measure of process changes within Earth surface systems, owing to their link with climatic and oceanic changes. The various approaches to quantification of the marine radiocarbon reservoir effect are assessed, focusing particularly on the North Atlantic Ocean. Currently, the global average marine reservoir age of surface waters, R(t), is c. 400 radiocarbon years; however, regional values deviate from this as a function of climate and oceanic circulation systems. These local deviations from R(t) are expressed as ΔR values. Hence, polar waters exhibit greater reservoir ages (ΔR = c. +400 to +800 <sup>14</sup>C y) than equatorial waters (ΔR = c. 0 <sup>14</sup>C y). Observed temporal variations in ΔR appear to reflect climatic and oceanographic changes. We assess three approaches to quantification of marine reservoir effects, using known-age samples (from museum collections), tephra isochrones (present onshore/offshore) and paired marine/terrestrial samples (from the same context in, for example, archaeological sites). The strengths and limitations of these approaches are evaluated using examples from the North Atlantic region. It is proposed that, with a suitable protocol, accelerator mass spectrometry (AMS) measurements on paired, short-lived, single-entity marine and terrestrial samples from archaeological deposits are the most promising approach to constraining changes over at least the last 5 ky BP.
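The quantities named in this abstract combine additively; a minimal worked sketch, using the abstract's own notation (the numerical values below are illustrative only, not measurements from this study):

```latex
% R(t):      global average surface-water reservoir age (c. 400 14C yr)
% \Delta R:  regional deviation from the global average
R_{\text{regional}}(t) = R(t) + \Delta R
% A marine sample and a contemporaneous terrestrial sample then differ
% in conventional 14C age by the regional reservoir age:
\text{age}_{\text{marine}} - \text{age}_{\text{terrestrial}} = R_{\text{regional}}(t)
% Illustrative polar-water case, taking R(t) = 400 and \Delta R = +400:
% R_regional = 400 + 400 = 800 14C yr, i.e. a polar marine shell would
% date c. 800 14C yr older than a contemporaneous terrestrial sample.
```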
SN 2005hj: Evidence for Two Classes of Normal-Bright SNe Ia and Implications for Cosmology
HET optical spectra covering the evolution from about 6 days before to about 5 weeks after maximum light, together with the ROTSE-IIIb unfiltered light curve, of the "Branch-normal" Type Ia supernova SN 2005hj are presented. The host galaxy shows HII-region lines at a redshift of z = 0.0574, which puts the peak unfiltered absolute magnitude at a somewhat over-luminous -19.6. The spectra show weak and narrow SiII lines, and for a period of at least 10 days beginning around maximum light these profiles do not change in width or depth; they indicate a constant expansion velocity of ~10,600 km/s. We analyzed the observations based on detailed radiation-dynamical models in the literature. Whereas delayed detonation and deflagration models have been used to explain the majority of SNe Ia, they do not predict a long velocity plateau in the SiII minimum with an unvarying line profile. Pulsating delayed detonations and merger scenarios form shell-like density structures with properties mostly related to the mass of the shell, M_shell, and we discuss how these models may explain the observed SiII line evolution; however, these models are based on spherical calculations and other possibilities may exist. SN 2005hj is consistent with respect to the onset, duration, and velocity of the plateau, the peak luminosity and, within the uncertainties, the intrinsic colors for models with M_shell = 0.2 M_sun. Our analysis suggests a distinct class of events hidden within the Branch-normal SNe Ia. If the predicted relations between observables are confirmed, they may provide a way to separate these two groups. We discuss the implications of two distinct progenitor classes for cosmological studies employing SNe Ia, including possible differences in the peak luminosity to light-curve width relation. Comment: ApJ accepted, 31 pages.
The Large Enriched Germanium Experiment for Neutrinoless Double Beta Decay (LEGEND)
The observation of neutrinoless double-beta decay (0νββ) would show that lepton number is violated, reveal that neutrinos are Majorana particles, and provide information on neutrino mass. A discovery-capable experiment covering the inverted ordering region, with effective Majorana neutrino masses of 15 - 50 meV, will require a tonne-scale experiment with excellent energy resolution and extremely low backgrounds, at the level of 0.1 count/(FWHM t yr) in the region of the signal. The current-generation 76Ge experiments GERDA and the MAJORANA DEMONSTRATOR, utilizing high-purity germanium detectors with an intrinsic energy resolution of 0.12%, have achieved the lowest backgrounds, by over an order of magnitude, in the 0νββ signal region of all 0νββ experiments. Building on this success, the LEGEND collaboration has been formed to pursue a tonne-scale 76Ge experiment. The collaboration aims to develop a phased 0νββ experimental program with discovery potential at a half-life approaching or at 10^28 years, using existing resources as appropriate to expedite physics results. Comment: Proceedings of the MEDEX'17 meeting (Prague, May 29 - June 2, 2017).
Anti-cancer effects and mechanism of actions of aspirin analogues in the treatment of glioma cancer
INTRODUCTION: In the past 25 years only modest advancements in glioma treatment have been made, with patient prognosis and median survival time following diagnosis increasing only from 3 to 7 months. A substantial body of clinical and preclinical evidence has suggested a role for aspirin in the treatment of cancer, with multiple mechanisms of action proposed, including COX-2 inhibition, downregulation of EGFR expression, and NF-κB signaling affecting Bcl-2 expression. However, given serious side effects such as stroke and gastrointestinal bleeding, aspirin analogues with improved potency and side-effect profiles are being developed. METHOD: Effects on cell viability following 24 hr incubation with four aspirin derivatives (PN508, 517, 526 and 529) were compared to cisplatin, aspirin and di-aspirin in four glioma cell lines (U87 MG, SVG P12, GOS-3, and 1321N1) using the PrestoBlue assay, establishing IC50 values and examining the time course of drug effects. RESULTS: All compounds decreased cell viability in a concentration- and time-dependent manner. Significantly, the analogue PN517 (IC50 2 mM) showed approximately a twofold increase in potency compared to aspirin (3.7 mM) and cisplatin (4.3 mM) in U87 cells, with a similar increase in potency in SVG P12 cells. The other analogues demonstrated potency similar to aspirin and cisplatin. CONCLUSION: These results support the further development and characterization of novel NSAID derivatives for the treatment of glioma.
Observational and Physical Classification of Supernovae
This chapter describes the current classification scheme of supernovae (SNe). This scheme has evolved over many decades and now includes numerous SN Types and sub-types. Many of these are universally recognized, while there are controversies regarding the definitions, membership and even the names of some sub-classes; we will try to review here the commonly-used nomenclature, noting the main variants when possible. SN Types are defined according to observational properties, mostly visible-light spectra near maximum light, as well as according to their photometric properties. However, a long-term goal of SN classification is to associate observationally-defined classes with specific physical explosive phenomena. We show here that this aspiration is now finally coming to fruition, and we establish the SN classification scheme upon direct observational evidence connecting SN groups with specific progenitor stars. Observationally, the broad class of Type II SNe contains objects showing strong spectroscopic signatures of hydrogen, while objects lacking such signatures are of Type I, which is further divided into numerous subclasses. Recently, a class of super-luminous SNe (SLSNe, typically 10 times more luminous than standard events) has been identified, and it is discussed. We end this chapter by briefly describing a proposed alternative classification scheme that is inspired by the stellar classification system. This system presents our emerging physical understanding of SN explosions, while clearly separating robust observational properties from physical inferences that can be debated. This new system is quantitative, and naturally deals with events distributed along a continuum, rather than being strictly divided into discrete classes. Thus, it may be more suitable to the coming era, when SN numbers will quickly expand from a few thousand to millions of events. Comment: Extended final draft of a chapter in the "SN Handbook". Comments most welcome.