
    Slum health: diseases of neglected populations.

    Background: Urban slums, like refugee communities, comprise a social cluster that engenders a distinct set of health problems. With 1 billion people currently estimated to live in such communities, this neglected population has become a major reservoir for a wide spectrum of health conditions that the formal health sector must deal with.

    Discussion: Unlike what occurs with refugee populations, the formal health sector becomes aware of the health problems of slum populations relatively late in the course of their illnesses. As such, it inevitably deals with the severe and end-stage complications of these diseases, at a substantially greater cost than that of managing non-slum populations. Because of the informal nature of slum settlements, and the cultural, social, and behavioral factors unique to slum populations, little is known about the spectrum, burden, and determinants of the illnesses in these communities that give rise to these complications, especially those that are chronic but preventable. In this article, we discuss observations made in one slum community of 58,000 people in Salvador, the third largest city in Brazil, to highlight the existence of a spectrum and burden of chronic illnesses unlikely to be detected by formal-sector health services until they result in complications or death. The lack of health-related data from slums could lead public and private providers to allocate health care resources inappropriately and unrealistically. Similar misassumptions and misallocations are likely to exist in other nations with large urban slum populations.

    Summary: Continued neglect of the world's ever-expanding urban slum populations could lead to greater expenditure and the diversion of health care resources to the management of end-stage complications of preventable diseases. A new approach to health assessment and the characterization of social-cluster determinants of health in urban slums is urgently needed.

    Discriminants, symmetrized graph monomials, and sums of squares

    Motivated by the needs of the invariant theory of binary forms, J. J. Sylvester constructed in 1878, for each graph with possible multiple edges but without loops, its symmetrized graph monomial, a polynomial in the vertex labels of the original graph. In the 20th century this construction was studied by several authors. We pose the question of which graphs yield a polynomial that is non-negative, respectively a sum of squares. This problem is motivated by a recent conjecture of F. Sottile and E. Mukhin on the discriminant of the derivative of a univariate polynomial, and by an interesting example, due to P. and A. Lax, of a graph with 4 edges whose symmetrized graph monomial is non-negative but not a sum of squares. We present detailed information about symmetrized graph monomials for graphs with four and six edges, obtained by computer calculations.
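    For a concrete reading of the construction (an illustration of the definition above, not the authors' code): attach the factor x_i - x_j to each edge {i, j}, take the product over all edges of the multigraph, and symmetrize by summing that product over all relabelings of the vertices. A minimal sympy sketch, assuming vertices indexed 0..n-1 and edges given as ordered pairs:

        from itertools import permutations
        import sympy as sp

        def symmetrized_graph_monomial(edges, n):
            """Sum of prod_{(i,j) in edges} (x_i - x_j) over all relabelings
            of the n vertices (Sylvester's construction, as described above)."""
            x = sp.symbols(f"x:{n}")
            g = sp.prod([x[i] - x[j] for i, j in edges])
            return sp.expand(sp.Add(*[
                g.subs(list(zip(x, [x[p[v]] for v in range(n)])), simultaneous=True)
                for p in permutations(range(n))
            ]))

        # A double edge on two vertices gives 2*(x0 - x1)**2, a sum of squares.
        print(symmetrized_graph_monomial([(0, 1), (0, 1)], 2))

    Whether the resulting polynomial is non-negative, or moreover a sum of squares, is exactly the question studied in the paper.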

    A Perspective on the Potential Role of Neuroscience in the Court

    This Article presents some lessons learned while offering expert testimony on neuroscience in courts. As a biomedical investigator engaged in cutting-edge research with clinical and mentoring responsibilities, Dr. Ruben Gur, Ph.D., became involved in court proceedings rather late in his career. Based on the success of Dr. Gur and other research investigators of his generation, who developed and validated advanced methods for linking brain structure and function to behavior, neuroscience findings and procedures became relevant to multiple legal issues, especially those related to culpability and mitigation. Dr. Gur found himself asked to opine in cases where he could contribute expertise on neuropsychological testing and on structural and functional neuroimaging. Most of his medical-legal consulting experience has been in capital cases, both because such cases carry an elevated legal requirement for thorough mitigation investigations and because of his limited availability as a full-time professor and research investigator who runs the Brain and Behavior Lab at the University of Pennsylvania ("Penn"). Courtroom testimony, however, has not been a topic of his research, and so he has not published extensively on these issues in the peer-reviewed literature.

    Properly colored Hamilton cycles in edge-colored complete graphs


    Scanning Electrochemical Microscopy of DNA Monolayers Modified with Nile Blue

    Scanning electrochemical microscopy (SECM) is used to probe long-range charge transport (CT) through DNA monolayers containing the redox-active Nile Blue (NB) intercalator covalently affixed at a specific location in the DNA film. At substrate potentials negative of the formal potential of covalently attached NB, the electrocatalytic reduction of Fe(CN)₆³⁻ generated at the SECM tip is observed only when NB is located at the DNA/solution interface; for DNA films containing NB in close proximity to the DNA/electrode interface, the electrocatalytic effect is absent. This behavior is consistent with rapid DNA-mediated CT between the NB intercalator and the gold electrode, combined with a rate-limiting electron transfer between NB and the solution-phase Fe(CN)₆³⁻. The DNA-mediated nature of the catalytic cycle is confirmed through sequence-specific and localized detection of attomoles of TATA-binding protein, a transcription factor that severely distorts DNA upon binding. Importantly, the strategy outlined here is general and allows for the local investigation of the surface characteristics of DNA monolayers both in the absence and in the presence of DNA-binding proteins. These experiments highlight the utility of DNA-modified electrodes as versatile platforms for SECM detection schemes that take advantage of CT mediated by the DNA base-pair stack.

    Parameterized Study of the Test Cover Problem

    We carry out a systematic study of a natural covering problem, used for identification across several areas, in the realm of parameterized complexity. In the Test Cover problem we are given a set [n] = {1, ..., n} of items together with a collection T of distinct subsets of these items, called tests. We assume that T is a test cover, i.e., for each pair of items there is a test in T containing exactly one of these items. The objective is to find a minimum-size subcollection of T which is still a test cover. The generic parameterized version of Test Cover is denoted by p(k, n, |T|)-Test Cover: we are given ([n], T) and a positive integer parameter k as input, and the objective is to decide whether there is a test cover of size at most p(k, n, |T|). We study four parameterizations of Test Cover and obtain the following: (a) k-Test Cover and (n-k)-Test Cover are fixed-parameter tractable (FPT); (b) (|T|-k)-Test Cover and (log n + k)-Test Cover are W[1]-hard, so it is unlikely that these problems are FPT.
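    To make the covering condition concrete, here is a brute-force sketch (illustrative only, not from the paper; the function names are ours) that checks the defining pair-separation property and finds a minimum test cover by exhaustive search:

        from itertools import combinations

        def is_test_cover(n, tests):
            """True iff for every pair of items in [n] = {1, ..., n} some test
            contains exactly one of the two items."""
            return all(any((i in t) != (j in t) for t in tests)
                       for i, j in combinations(range(1, n + 1), 2))

        def minimum_test_cover(n, tests):
            """Smallest subcollection that is still a test cover (exponential-time
            search; the paper asks when this is tractable in a parameter k)."""
            for size in range(1, len(tests) + 1):
                for sub in combinations(tests, size):
                    if is_test_cover(n, list(sub)):
                        return list(sub)

        tests = [{1, 2}, {1, 3}, {2, 4}, {1}]
        print(is_test_cover(4, tests))       # True
        print(minimum_test_cover(4, tests))  # [{1, 2}, {1, 3}], a size-2 cover

    The parameterizations above then ask, for instance, whether a cover of size at most k (or n - k, |T| - k, log n + k) can be found in FPT time rather than by such exhaustive search.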

    On Deterministic Sketching and Streaming for Sparse Recovery and Norm Estimation

    We study classic streaming and sparse recovery problems using deterministic linear sketches, including l1/l1 and linf/l1 sparse recovery (the latter also known as l1-heavy hitters), norm estimation, and approximate inner product. We focus on devising a fixed matrix A in R^{m x n} and a deterministic recovery/estimation procedure which work for all possible input vectors simultaneously. Our results improve upon existing work, the following being our main contributions:

    * A proof that linf/l1 sparse recovery and inner product estimation are equivalent, and that incoherent matrices can be used to solve both problems. Our upper bound for the number of measurements is m = O(eps^{-2} * min{log n, (log n / log(1/eps))^2}). We can also obtain fast sketching and recovery algorithms by making use of the Fast Johnson-Lindenstrauss transform. Both our running times and number of measurements improve upon previous work. We can also obtain better error guarantees than previous work in terms of a smaller tail of the input vector.

    * A new lower bound for the number of linear measurements required to solve l1/l1 sparse recovery. We show that Omega(k/eps^2 + k log(n/k)/eps) measurements are required to recover an x' with |x - x'|_1 <= (1+eps)|x_{tail(k)}|_1, where x_{tail(k)} is x projected onto all but its largest k coordinates in magnitude.

    * A tight bound of m = Theta(eps^{-2} log(eps^2 n)) on the number of measurements required to solve deterministic norm estimation, i.e., to recover |x|_2 +/- eps|x|_1.

    For all the problems we study, tight bounds are already known for the randomized complexity from previous work, except in the case of l1/l1 sparse recovery, where a nearly tight bound is known. Our work thus aims to study the deterministic complexities of these problems.
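    As a small numerical illustration of the incoherent-matrix idea in the first bullet (our sketch, not the paper's construction: we use a random matrix, which is incoherent only with high probability, whereas the paper concerns deterministic for-all guarantees): if A has unit-norm columns with pairwise coherence mu, the point query x_hat_i = <a_i, Ax> satisfies |x_hat_i - x_i| <= mu * |x|_1 for every input x, which is an linf/l1 recovery guarantee.

        import numpy as np

        rng = np.random.default_rng(0)
        m, n = 100, 200
        # Unit-norm columns; a random matrix is incoherent with high probability.
        # (The paper wants deterministic constructions; this is illustration only.)
        A = rng.standard_normal((m, n))
        A /= np.linalg.norm(A, axis=0)
        mu = np.abs(A.T @ A - np.eye(n)).max()   # coherence: max_{i != j} |<a_i, a_j>|

        x = np.zeros(n)
        x[[3, 17, 42]] = [5.0, -2.0, 1.0]        # a sparse input vector
        y = A @ x                                # the linear sketch (m numbers)
        x_hat = A.T @ y                          # point queries x_hat_i = <a_i, y>

        # linf/l1 guarantee: every coordinate recovered to within mu * |x|_1.
        print(np.abs(x_hat - x).max() <= mu * np.abs(x).sum())   # True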

    How does an interacting many-body system tunnel through a potential barrier to open space?

    The tunneling process in a many-body system is a phenomenon which lies at the very heart of quantum mechanics. It appears in nature in the form of alpha-decay, fusion, and fission in nuclear physics, and of photoassociation and photodissociation in biology and chemistry. A detailed theoretical description of the decay process in these systems is a very cumbersome problem, either because of very complicated or even unknown interparticle interactions or due to the large number of constituent particles. In this work, we theoretically study the phenomenon of quantum many-body tunneling in a more transparent and controllable physical system: an ultracold atomic gas. We analyze a full, numerically exact many-body solution of the Schrödinger equation of a one-dimensional system with repulsive interactions tunneling to open space. We show how the emitted particles dissociate or fragment from the trapped and coherent source of bosons: the overall many-particle decay process is a quantum interference of single-particle tunneling processes, emerging from sources with different particle numbers, that take place simultaneously. The close relation to atom lasers and ionization processes allows us to unveil the great relevance of many-body correlations between the emitted and trapped fractions of the wavefunction in the respective processes.

    Comment: 18 pages, 4 figures (7 pages, 2 figures of supplementary information).
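    As a toy illustration of tunneling to open space (a one-body sketch of ours; the paper solves the full interacting many-body Schrödinger equation, which this does not attempt), one can propagate a trapped wavepacket through a finite barrier with the split-step Fourier method and watch the trapped probability decay:

        import numpy as np

        # Single particle prepared in a well, tunneling through a finite
        # barrier into open space (hbar = m = 1); split-step Fourier method.
        N, dt, steps = 2048, 0.002, 5000
        x = np.linspace(-10.0, 70.0, N, endpoint=False)
        dx = x[1] - x[0]
        k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

        V = np.where(x < 0.0, 50.0, 0.0)                   # steep wall on the left
        V = V + np.where((x > 2.0) & (x < 3.0), 4.0, 0.0)  # barrier to tunnel through

        psi = np.exp(-(x - 1.0) ** 2).astype(complex)      # state inside the trap
        psi /= np.sqrt((np.abs(psi) ** 2).sum() * dx)

        expV = np.exp(-0.5j * dt * V)        # half-step potential propagator
        expT = np.exp(-0.5j * dt * k ** 2)   # kinetic propagator in k-space
        inside = (x > 0.0) & (x < 3.0)       # region counted as "trapped"

        for step in range(steps + 1):
            if step % 1000 == 0:
                p_in = (np.abs(psi[inside]) ** 2).sum() * dx
                print(f"t = {step * dt:5.1f}   P(trapped) = {p_in:.4f}")
            psi = expV * np.fft.ifft(expT * np.fft.fft(expV * psi))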

    Attosecond time-resolved photoelectron holography

    Ultrafast strong-field physics provides insight into quantum phenomena that evolve on an attosecond time scale, the most fundamental of which is quantum tunneling. The tunneling process initiates a range of strong-field phenomena such as high harmonic generation (HHG), laser-induced electron diffraction, double ionization, and photoelectron holography, all evolving during a fraction of the optical cycle. Here we apply attosecond photoelectron holography as a method to resolve the temporal properties of the tunneling process. Adding a weak second harmonic (SH) field to a strong fundamental laser field enables us to reconstruct, with attosecond precision, the ionization times of the photoelectrons that form the photoelectron hologram. We decouple the contributions of the two arms of the hologram and resolve the subtle differences in their ionization times, separated by only a few tens of attoseconds.