
    Tackling the Global NCD Crisis: Innovations in Law and Governance

    35 million people die annually of non-communicable diseases (NCDs), 80% of them in low- and middle-income countries, representing a marked epidemiological transition from infectious to chronic diseases and from richer to poorer countries. The total number of NCD deaths is projected to rise by 17% over the coming decade, absent significant interventions. The NCD epidemic poses unique governance challenges: the causes are multifactorial, the affected populations diffuse, and effective responses require sustained multisectoral cooperation. The authors propose a range of regulatory options available at the domestic level, including stricter food labeling laws, regulation of food advertisements, tax incentives for healthy lifestyle choices, changes to the built environment, and direct regulation of food and drink producers. Given the realities of globalization, such interventions require global cooperation. In 2011, the UN General Assembly held a High-level Meeting on NCDs, setting a global target of a 25% reduction in premature mortality from NCDs by 2025. Yet concrete plans and resource commitments for reaching this goal are not yet in the offing, and the window is rapidly closing for achieving these targets through prevention rather than through treatment, which is more costly. Innovative global governance for health is urgently needed to engage private industry and civil society in the global response to the NCD crisis.

    Stop Decay with LSP Gravitino in the final state: $\tilde{t}_1\to\widetilde{G}\,W\,b$

    In MSSM scenarios where the gravitino is the lightest supersymmetric particle (LSP), and therefore a viable dark matter candidate, the stop $\tilde{t}_1$ could be the next-to-lightest superpartner (NLSP). For a mass spectrum satisfying $m_{\widetilde{G}}+m_t>m_{\tilde{t}_1}>m_{\widetilde{G}}+m_b+m_W$, the stop decay is dominated by the 3-body mode $\tilde{t}_1\rightarrow b\,W\,\widetilde{G}$. We calculate the stop lifetime, including the full contributions from top, sbottom, and chargino as intermediate states. We also evaluate the stop lifetime for the case when the gravitino can be approximated by the goldstino state. Our analytical results are conveniently expressed using an expansion in terms of the intermediate state mass, which helps to identify the massless limit. In the region of low gravitino mass ($m_{\widetilde{G}}\ll m_{\tilde{t}_1}$), the results obtained using the gravitino and goldstino cases turn out to be similar, as expected. However, for higher gravitino masses, $m_{\widetilde{G}} \lesssim m_{\tilde{t}_1}$, the results for the lifetime could show a difference of $\mathcal{O}(100)\%$.
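    For context, the lifetime discussed here follows from the generic three-body partial width; the expression below is the standard Dalitz-plot formula, shown for orientation only, while the paper's matrix element includes the top, sbottom, and chargino exchange contributions:

    \[
    \Gamma(\tilde{t}_1\to b\,W\,\widetilde{G})
      = \frac{1}{256\,\pi^{3}\,m_{\tilde{t}_1}^{3}}
        \int dm_{bW}^{2}\, dm_{W\widetilde{G}}^{2}\;\overline{|\mathcal{M}|^{2}},
    \qquad
    \tau_{\tilde{t}_1} = \frac{\hbar}{\Gamma_{\tilde{t}_1}}.
    \]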

    Radar-anomalous, high-altitude features on Venus

    Over nearly all of the surface of Venus, the reflectivity and emissivity at centimeter wavelengths are about 0.15 and 0.85, respectively. These values are consistent with moderately dense soils and rock populations, but the mean reflectivity is about a factor of 2 greater than that for the Moon and other terrestrial planets. Pettengill and Ford, using Pioneer Venus reflectivities and emissivities, found a number of anomalous features on Venus that showed much higher reflectivities and much lower emissivities, with both values approaching 0.5. These include Maxwell Montes, a number of high regions in Aphrodite Terra and Beta Regio, and several isolated mountain peaks. Most of the features lie 2 to 3 km or more above the mean radius. However, such features have also been found in the Magellan data at low altitudes, and the anomalies do not exist on all high structures, Maat Mons being the most outstanding example. A number of papers have attempted to explain the phenomena in terms of the geochemical balance of weathering effects on likely surface minerals. The geochemists have shown that the fundamentally basaltic surface would be stable at the temperatures and pressures of the mean radius in the form of magnetite, but would evolve to pyrite and/or pyrrhotite in the presence of sulfur-bearing compounds such as SO2. Pyrite will be stable at altitudes above 4 or 5 km on Venus. Although the geochemical arguments are rather compelling, it is important to consider other explanations for the radar and radio emission measurements, such as that presented by Tryka and Muhleman. The radar reflectivity values are retrieved from the raw Magellan backscatter measurements by fitting Hagfors' radar scattering model, in which a surface roughness parameter and a normal-incidence electrical reflectivity are estimated. The assumptions of the theory behind the model must be considered carefully before the results can be believed. These are that the surface roughness exists only at horizontal scales large compared to the wavelength, that the vertical deviations are Gaussian distributed, that there is no shadowing, and that the reflection occurs at the interface of two homogeneous dielectric half-spaces. Probably all of these conditions are violated at the anomalous features under discussion. The most important is the homogeneity of the near surface of Venus, particularly in the highlands. Under the assumptions of the theory, all of the radio energy is reflected by the impedance jump at the very boundary. However, in heterogeneous soil some fraction of the illuminating energy propagates into the soil and is then scattered back out by impedance discontinuities such as rocks, voids, and cracks. In light soils, the latter effect can overwhelm the scattering from the true surface and greatly enhance the backscattered power, suggesting a much higher value of the effective dielectric constant than would be estimated from Hagfors' model.
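    For reference, the Hagfors scattering law used in such fits is commonly written in the following standard form, quoted here for clarity rather than taken from this abstract:

    \[
    \sigma^{0}(\theta) = \frac{\rho\,C}{2}\,\bigl(\cos^{4}\theta + C\,\sin^{2}\theta\bigr)^{-3/2},
    \]

    where $\theta$ is the incidence angle, $\rho$ is the normal-incidence Fresnel reflectivity, and $C$ is the roughness parameter (roughly the inverse mean-square surface slope). The fitted $\rho$ and $C$ are the reflectivity and roughness estimates whose interpretation is questioned above.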

    Radar Investigation of Mars, Mercury, and Titan

    Radar astronomy is the study of the surfaces and near surfaces of Solar System objects using active transmission of modulated radio waves and the detection of the reflected energy. The scientific goals of such experiments are surprisingly broad and include the study of surface slopes, fault lines, craters, mountain ranges, and other morphological structures. Electrical reflectivities contain information about surface densities and, to some extent, the chemical composition of the surface layers. Radar probes the subsurface layers to depths of the order of 10 wavelengths, providing geological mapping and determinations of the object’s spin state. Radar also allows one to study an object’s atmosphere and ionized layers as well as those of the interplanetary medium. Precise measurements of the time delay to surface elements provide topographic maps and powerful information on planetary motions and tests of gravitational theories such as general relativity. In this paper, we limit our discussion to surface and near-surface probing of Mercury, Mars, and Titan and review the work of the past decade, which includes fundamentally new techniques for Earth-based imaging. The most primitive experiments involve just the measurement of the total echo power from the object. The most sophisticated experiments would produce spatially resolved maps of the reflected power in all four Stokes parameters. Historically, the first experiments produced echoes from the Moon during the period shortly after World War II (see e.g. Evans 1962), but the subject did not really develop until the early 1960s, when the radio equipment became sufficiently sensitive to detect echoes from Venus and obtain the first Doppler strip "maps" of that planet. The first successful planetary radar systems were the Continuous Wave (CW) radar at the Goldstone facility of Caltech’s Jet Propulsion Laboratory and the pulse radar at the MIT Lincoln Laboratory. All of the terrestrial planets were successfully studied during the following decade, yielding the spin states of Venus and Mercury, a precise value of the astronomical unit, and a host of totally new discoveries concerning the surfaces of the terrestrial planets and the Moon; this work opened up at least as many new questions. Although the early work was done at resolution scales on the order of the planetary radii, very rapid increases in system sensitivity improved the resolution to the order of 100 km, but always with map ambiguities. Recently, unambiguous resolution of 100 m over nearly the entire surface of Venus has been achieved from the Magellan spacecraft using a side-looking, synthetic aperture radar. Reviews of the work up to the Magellan era can be found in Evans (1962), Muhleman et al (1965), Evans & Hagfors (1968; see the chapters written by G Pettengill, T Hagfors, and J Evans), and Ostro (1993). The radar study of Venus from the Magellan spacecraft was a tour de force and is well described in special issues of Science (volume 252, April 12, 1991) and the Journal of Geophysical Research (volume 97, August 25 and October 25, 1992). Venus will not be considered in this paper even though important polarization work on that planet continues at Arecibo, Goldstone, and the Very Large Array (VLA).
In this paper we review the most recent work in Earth-based radar astronomy using new techniques of Earth-rotation super synthesis at the VLA in New Mexico (operated by the National Radio Astronomy Observatory) and the recently developed "long-code" technique at the Arecibo Observatory in Puerto Rico (operated by Cornell University). [Note: It was recently brought to our attention that the VLA software "doubles" the flux density of their primary calibrators. Consequently, it is necessary to halve the radar power and reflectivity numerical values in all of our published radar results from the VLA/Goldstone radar.] The symbiotic relationship between these new developments and recent advances in our understanding of Mercury and Mars is remarkable. VLA imaging provides, for the first time, unambiguous images of an entire hemisphere of a planet, and the long-code technique makes it possible to map Mars and Mercury using the traditional range-gated Doppler strip-mapping procedure [which was, apparently, developed theoretically at the Lincoln Laboratory by Paul Green, based on a citation in Evans (1962)]. Richard Goldstein was the first to obtain range-gated planetary maps of Venus, as reported in Carpenter & Goldstein (1963). Such a system was developed earlier for the Moon, as reported by Pettengill (1960) and Pettengill & Henry (1962). We first discuss the synthesis mapping technique.
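    As a reminder of the geometry behind range-gated Doppler strip mapping (standard relations, stated here for orientation rather than taken from this review): a surface element a distance $z$ behind the sub-radar point along the line of sight, and a distance $x$ from the projected apparent spin axis, returns an echo at round-trip delay and two-way Doppler shift

    \[
    \tau = \frac{2z}{c}, \qquad f_{D} = \frac{2\,\Omega_{\mathrm{app}}\,x}{\lambda},
    \]

    where $\Omega_{\mathrm{app}}$ is the apparent rotation rate and $\lambda$ the wavelength. A constant-delay ring and a constant-Doppler strip intersect at two points mirrored about the Doppler equator; this north-south ambiguity is the map ambiguity mentioned above, which unambiguous VLA synthesis imaging avoids.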

    SUPPLY AND DEMAND ISSUES FOR A CONVENIENCE LEARNING COURSE

    This paper explores and analyzes the supply of and demand for university-level convenience learning courses. The procedures involve the use of microeconomic theory to conceptually analyze supply, demand, benefits, and costs, and a case-study comparison of a traditional course with a convenience learning course that has been offered for three years.
    Keywords: Teaching/Communication/Extension/Profession

    Measuring Consensus in Binary Forecasts: NFL Game Predictions

    Previous research on defining and measuring consensus (agreement) among forecasters has been concerned with evaluation of forecasts of continuous variables. This previous work is not relevant when the forecasts involve binary decisions: up-down or win-lose. In this paper we use Cohen's kappa coefficient, a measure of inter-rater agreement involving binary choices, to evaluate forecasts of National Football League games. This statistic is applied to the forecasts of 74 experts and 31 statistical systems that predicted the outcomes of games during two NFL seasons. We conclude that the forecasters, particularly the systems, displayed significant levels of agreement and that levels of agreement in picking game winners were higher than in picking against the betting line. There is greater agreement among statistical systems in picking game winners or picking winners against the line as the season progresses, but no change in levels of agreement among experts. High levels of consensus among forecasters are associated with greater accuracy in picking game winners, but not in picking against the line.
    Keywords: binary forecasts, NFL, agreement, consensus, kappa coefficient
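    As an illustration of the agreement statistic used in the paper, here is a minimal sketch of Cohen's kappa for two binary forecasters; the picks shown are hypothetical, not data from the study.

    def cohen_kappa(picks_a, picks_b):
        """Cohen's kappa for two forecasters making binary picks coded as 0/1."""
        n = len(picks_a)
        p_o = sum(a == b for a, b in zip(picks_a, picks_b)) / n   # observed agreement
        pa = sum(picks_a) / n                                     # each forecaster's rate of picking "1"
        pb = sum(picks_b) / n
        p_e = pa * pb + (1 - pa) * (1 - pb)                       # agreement expected by chance
        return (p_o - p_e) / (1 - p_e)

    # Two hypothetical experts picking the home team (1) or visitor (0) in eight games.
    print(cohen_kappa([1, 1, 0, 1, 0, 1, 1, 0],
                      [1, 0, 0, 1, 0, 1, 1, 1]))   # about 0.47

    Kappa is the observed agreement corrected for the agreement expected by chance from each forecaster's marginal pick rates, so 0 corresponds to chance-level consensus and 1 to perfect consensus.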

    Parameterization of Tensor Network Contraction

    We present a conceptually clear and algorithmically useful framework for parameterizing the costs of tensor network contraction. Our framework is completely general, applying to tensor networks with arbitrary bond dimensions, open legs, and hyperedges. The fundamental objects of our framework are rooted and unrooted contraction trees, which represent classes of contraction orders. Properties of a contraction tree correspond directly and precisely to the time and space costs of tensor network contraction. The properties of rooted contraction trees give the costs of parallelized contraction algorithms. We show how contraction trees relate to existing tree-like objects in the graph theory literature, bringing a wide range of graph algorithms and tools to bear on tensor network contraction. Independent of tensor networks, we show that the edge congestion of a graph is almost equal to the branchwidth of its line graph.
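    To make the cost parameterization concrete, here is a minimal sketch (a toy illustration, not the paper's framework) of a rooted contraction tree evaluated recursively to obtain the total multiply count of a pairwise contraction order; it assumes each bond joins at most two tensors, so it ignores the hyperedges that the paper also handles.

    def contract_cost(tree, index_sets, dims):
        """Cost of contracting a tensor network along a rooted contraction tree.
        tree: a leaf is a tensor name (str); an internal node is a pair (left, right).
        index_sets: tensor name -> set of index labels carried by that tensor.
        dims: index label -> bond dimension.
        Returns (index set of the resulting tensor, total multiply count)."""
        if isinstance(tree, str):                      # leaf: an input tensor, no cost yet
            return index_sets[tree], 0
        left, right = tree
        li, lcost = contract_cost(left, index_sets, dims)
        ri, rcost = contract_cost(right, index_sets, dims)
        step = 1
        for label in li | ri:                          # cost of this pairwise contraction:
            step *= dims[label]                        # product over all indices involved
        out = (li | ri) - (li & ri)                    # shared bonds are summed away
        return out, lcost + rcost + step

    # Toy chain A(i,j) B(j,k) C(k,l): the two rooted trees give different costs.
    index_sets = {"A": {"i", "j"}, "B": {"j", "k"}, "C": {"k", "l"}}
    dims = {"i": 2, "j": 3, "k": 4, "l": 5}
    print(contract_cost((("A", "B"), "C"), index_sets, dims))  # open legs {i, l}, 64 multiplies
    print(contract_cost(("A", ("B", "C")), index_sets, dims))  # open legs {i, l}, 90 multiplies

    The space cost of an order can similarly be read off the sizes of the intermediate index sets produced along the tree, which is the sense in which properties of a contraction tree capture both time and space costs.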

    Teacher Perceptions of the American School Counselor Association’s National Model in an Urban Setting

    The development of ASCA's National Standards and Model has helped define the profession and provided a framework for school counselors to implement in designing a program. Despite recent clarity in the school counseling profession, barriers still exist, especially in urban settings. Because teachers are collaborators in such programs, their perceptions of urban school counselors' implementation of the ASCA Model and its components (Elements/Themes) were measured. Overall, results showed that teachers were in favor of the ASCA National Model and its components. Teachers' gender and number of years teaching did not significantly influence responses to survey questions. Despite these favorable perceptions of the model, more research needs to be conducted in urban schools to determine whether this model is practical and feasible.